ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
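For orientation: the volume specification echoed later in this run (under "Show storage_volumes", in the "Test for correct handling of new encrypted volume w/ no key" scenario) corresponds to a storage role invocation roughly like the sketch below. Only the role name and the volume definition come from this log; the play wrapper, task name, and the storage_safe_mode value are assumptions added for illustration and are not the actual test source.

    # Sketch only - reconstructed from values visible in this log, not the real tests_luks.yml.
    - hosts: all
      tasks:
        - name: Create an encrypted disk volume without supplying a key (expected to be rejected)
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: true          # assumption; the log records storage_safe_mode_global: true
            storage_volumes:
              - name: foo
                type: disk
                disks:
                  - sda
                mount_point: /opt/test1
                encryption: true             # no key or passphrase variable is set, which the test expects the role to reject
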
PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Friday 17 April 2026 20:09:11 -0400 (0:00:00.439) 0:00:00.439 **********
ok: [managed-node16]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Friday 17 April 2026 20:09:15 -0400 (0:00:04.036) 0:00:04.475 **********
skipping: [managed-node16] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot] ******************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28
Friday 17 April 2026 20:09:16 -0400 (0:00:00.384) 0:00:04.859 **********
skipping: [managed-node16] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39
Friday 17 April 2026 20:09:16 -0400 (0:00:00.774) 0:00:05.634 **********
skipping: [managed-node16] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43
Friday 17 April 2026 20:09:17 -0400 (0:00:00.370) 0:00:06.004 **********
skipping: [managed-node16] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Friday 17 April 2026 20:09:17 -0400 (0:00:00.404) 0:00:06.409 **********
skipping: [managed-node16] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59
Friday 17 April 2026 20:09:18 -0400 (0:00:00.428) 0:00:06.837 **********
skipping: [managed-node16] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68
Friday 17 April 2026 20:09:18 -0400 (0:00:00.329) 0:00:07.167 **********
skipping: [managed-node16] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Run the role] ************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72
Friday 17 April 2026 20:09:18 -0400 (0:00:00.430) 0:00:07.597 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16
META: facts cleared

TASK [Run the
role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:09:19 -0400 (0:00:00.272) 0:00:07.869 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:09:19 -0400 (0:00:00.504) 0:00:08.374 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:09:20 -0400 (0:00:00.692) 0:00:09.066 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:09:22 -0400 (0:00:02.574) 0:00:11.640 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:09:23 -0400 (0:00:00.317) 0:00:11.957 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:09:25 -0400 (0:00:01.934) 0:00:13.892 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if 
system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:09:25 -0400 (0:00:00.610) 0:00:14.502 ********** ok: [managed-node16] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:09:28 -0400 (0:00:02.324) 0:00:16.827 ********** ok: [managed-node16] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:09:28 -0400 (0:00:00.266) 0:00:17.093 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:09:28 -0400 (0:00:00.177) 0:00:17.271 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:09:28 -0400 (0:00:00.169) 0:00:17.440 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:09:29 -0400 (0:00:00.881) 0:00:18.322 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:09:29 -0400 (0:00:00.278) 0:00:18.600 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:09:30 -0400 (0:00:00.265) 0:00:18.866 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:09:35 -0400 (0:00:05.415) 0:00:24.281 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:09:35 -0400 (0:00:00.278) 0:00:24.560 ********** ok: [managed-node16] => { "storage_volumes | 
d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:09:36 -0400 (0:00:00.230) 0:00:24.790 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:09:38 -0400 (0:00:02.813) 0:00:27.604 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:09:39 -0400 (0:00:00.236) 0:00:27.840 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:09:39 -0400 (0:00:00.130) 0:00:27.971 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:09:39 -0400 (0:00:00.243) 0:00:28.215 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:09:39 -0400 (0:00:00.190) 0:00:28.405 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:09:43 -0400 (0:00:03.795) 0:00:32.200 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": 
"chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { 
"name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:09:47 -0400 (0:00:03.673) 0:00:35.874 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:09:47 -0400 (0:00:00.349) 0:00:36.223 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:09:49 -0400 (0:00:01.798) 0:00:38.022 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:09:49 -0400 (0:00:00.130) 0:00:38.153 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776470358.5319965, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776470357.050992, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776470357.050992, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:09:50 -0400 (0:00:01.333) 0:00:39.486 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:09:50 -0400 (0:00:00.217) 0:00:39.703 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:09:51 -0400 (0:00:00.229) 0:00:39.933 ********** ok: [managed-node16] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:09:51 -0400 (0:00:00.194) 0:00:40.127 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 
2026 20:09:51 -0400 (0:00:00.105) 0:00:40.233 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:09:51 -0400 (0:00:00.058) 0:00:40.291 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:09:51 -0400 (0:00:00.025) 0:00:40.316 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:09:51 -0400 (0:00:00.047) 0:00:40.364 ********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:09:51 -0400 (0:00:00.061) 0:00:40.425 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:09:51 -0400 (0:00:00.036) 0:00:40.461 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:09:51 -0400 (0:00:00.050) 0:00:40.512 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776469722.1311164, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:09:52 -0400 (0:00:00.665) 0:00:41.178 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:09:52 -0400 (0:00:00.055) 0:00:41.233 ********** ok: [managed-node16] TASK [Get unused disks] ******************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:75 Friday 17 April 2026 20:09:53 -0400 (0:00:01.472) 0:00:42.705 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node16 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Friday 17 April 2026 20:09:54 -0400 (0:00:00.331) 0:00:43.037 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Friday 17 April 2026 20:09:57 -0400 (0:00:03.660) 0:00:46.697 ********** ok: [managed-node16] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Friday 17 April 2026 20:10:00 -0400 (0:00:02.709) 0:00:49.407 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Friday 17 April 2026 20:10:00 -0400 (0:00:00.268) 0:00:49.676 ********** ok: [managed-node16] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Friday 17 April 2026 20:10:01 -0400 (0:00:00.276) 0:00:49.952 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Friday 17 April 2026 20:10:01 -0400 (0:00:00.270) 0:00:50.223 ********** ok: [managed-node16] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:84 Friday 17 April 2026 20:10:01 -0400 (0:00:00.241) 0:00:50.464 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:10:02 -0400 (0:00:00.335) 0:00:50.800 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:10:02 -0400 (0:00:00.247) 0:00:51.047 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:10:02 -0400 (0:00:00.370) 0:00:51.417 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:10:02 -0400 (0:00:00.096) 0:00:51.514 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:10:02 -0400 (0:00:00.211) 0:00:51.725 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:10:04 -0400 (0:00:01.906) 0:00:53.631 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:10:05 -0400 (0:00:00.253) 0:00:53.885 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:10:07 -0400 (0:00:01.878) 0:00:55.764 ********** skipping: [managed-node16] => 
(item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:10:07 -0400 (0:00:00.520) 0:00:56.285 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:10:07 -0400 (0:00:00.310) 0:00:56.595 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:10:08 -0400 (0:00:00.144) 0:00:56.739 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:10:08 -0400 (0:00:00.161) 0:00:56.901 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:10:08 -0400 (0:00:00.158) 0:00:57.059 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 
April 2026 20:10:08 -0400 (0:00:00.622) 0:00:57.682 **********
skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Friday 17 April 2026 20:10:09 -0400 (0:00:00.152) 0:00:57.834 **********
skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Friday 17 April 2026 20:10:09 -0400 (0:00:00.133) 0:00:57.968 **********
ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Friday 17 April 2026 20:10:13 -0400 (0:00:04.066) 0:01:02.034 **********
ok: [managed-node16] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 17 April 2026 20:10:13 -0400 (0:00:00.197) 0:01:02.232 **********
ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 April 2026 20:10:13 -0400 (0:00:00.172) 0:01:02.404 **********
ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Friday 17 April 2026 20:10:18 -0400 (0:00:04.986) 0:01:07.391 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 April 2026 20:10:19 -0400 (0:00:00.393) 0:01:07.785 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 April 2026 20:10:19 -0400 (0:00:00.139) 0:01:07.924 **********
skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 April 2026 20:10:19 -0400 (0:00:00.169) 0:01:08.094 **********
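The storage_volumes value echoed a few tasks above requests encryption for disk "sda" but carries no encryption_password or encryption_key, which is exactly the input this test case ("Test for correct handling of new encrypted volume w/ no key") expects the role to reject. A minimal sketch of that kind of invocation, under a hypothetical playbook layout (the actual test drives the role through run_role_with_clear_facts.yml, which is not reproduced in this log):

- hosts: managed-node16
  tasks:
    - name: Attempt an encrypted disk volume without any key (expected to fail)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # safe mode was in effect for this run (see storage_safe_mode_global above)
        storage_safe_mode: true
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: true
            # no encryption_password or encryption_key is supplied, so the role
            # fails with "encrypted volume 'foo' missing key/password" below

The later case in this run ("Create an encrypted disk volume w/ default fs") passes the same volume with encryption_password set, and the role is then expected to create the LUKS device instead of failing.

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: 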
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:10:19 -0400 (0:00:00.230) 0:01:08.324 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:10:24 -0400 (0:00:04.617) 0:01:12.942 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:10:27 -0400 (0:00:02.797) 0:01:15.739 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:10:27 -0400 (0:00:00.276) 0:01:16.016 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:10:32 -0400 (0:00:05.546) 0:01:21.563 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:10:33 -0400 (0:00:00.188) 0:01:21.752 ********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:10:33 -0400 (0:00:00.279) 0:01:22.032 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:10:33 -0400 (0:00:00.185) 0:01:22.218 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify 
correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:10:33 -0400 (0:00:00.281) 0:01:22.500 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:99 Friday 17 April 2026 20:10:34 -0400 (0:00:00.241) 0:01:22.742 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:10:34 -0400 (0:00:00.319) 0:01:23.061 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:10:34 -0400 (0:00:00.148) 0:01:23.209 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:10:34 -0400 (0:00:00.216) 0:01:23.426 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:10:36 -0400 (0:00:01.768) 0:01:25.195 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:10:36 -0400 (0:00:00.241) 0:01:25.436 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:10:38 -0400 (0:00:01.812) 0:01:27.249 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:10:38 -0400 (0:00:00.369) 0:01:27.619 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:10:39 -0400 (0:00:00.201) 0:01:27.821 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:10:39 -0400 (0:00:00.176) 0:01:27.998 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:10:39 -0400 (0:00:00.134) 0:01:28.132 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:10:39 -0400 (0:00:00.171) 0:01:28.304 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:10:39 -0400 (0:00:00.325) 0:01:28.629 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:10:40 -0400 (0:00:00.202) 0:01:28.831 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:10:40 -0400 (0:00:00.188) 0:01:29.020 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:10:44 -0400 (0:00:03.985) 0:01:33.005 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:10:44 -0400 (0:00:00.214) 0:01:33.220 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:10:44 -0400 (0:00:00.182) 0:01:33.403 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:10:49 -0400 (0:00:04.954) 0:01:38.357 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:10:49 -0400 (0:00:00.361) 0:01:38.719 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:10:50 -0400 (0:00:00.170) 0:01:38.890 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:10:50 -0400 (0:00:00.252) 0:01:39.143 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:10:50 -0400 (0:00:00.173) 0:01:39.316 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:10:54 -0400 (0:00:04.365) 0:01:43.681 ********** ok: [managed-node16] => { "ansible_facts": { "services": { 
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, 
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { 
"name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { 
"name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": 
"systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": 
"user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:10:57 -0400 (0:00:02.702) 0:01:46.384 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:10:58 -0400 (0:00:00.458) 0:01:46.842 ********** changed: [managed-node16] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:11:11 -0400 (0:00:13.313) 0:02:00.156 
********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:11:11 -0400 (0:00:00.196) 0:02:00.353 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776470358.5319965, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776470357.050992, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776470357.050992, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:11:13 -0400 (0:00:01.551) 0:02:01.904 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:11:15 -0400 (0:00:02.340) 0:02:04.245 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:11:15 -0400 (0:00:00.282) 0:02:04.527 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:11:16 -0400 (0:00:00.323) 0:02:04.851 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:11:16 -0400 (0:00:00.253) 0:02:05.105 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:11:16 -0400 (0:00:00.268) 0:02:05.373 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:11:16 -0400 (0:00:00.187) 0:02:05.561 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:11:20 -0400 
(0:00:04.015) 0:02:09.576 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:11:23 -0400 (0:00:02.842) 0:02:12.419 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:11:24 -0400 (0:00:00.355) 0:02:12.774 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:11:25 -0400 (0:00:01.615) 0:02:14.389 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776469722.1311164, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:11:27 -0400 (0:00:01.495) 0:02:15.885 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda', 'name': 'luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52', 'password': '-', 'state': 'present'}) => { 
"ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:11:28 -0400 (0:00:01.634) 0:02:17.519 ********** ok: [managed-node16] TASK [Verify role results] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:110 Friday 17 April 2026 20:11:30 -0400 (0:00:02.079) 0:02:19.599 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:11:31 -0400 (0:00:00.412) 0:02:20.011 ********** skipping: [managed-node16] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:11:31 -0400 (0:00:00.233) 0:02:20.245 ********** ok: [managed-node16] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:11:31 -0400 (0:00:00.284) 0:02:20.529 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "size": "10G", "type": "crypt", "uuid": "80e1d472-ccb1-4883-8767-fb4c91ed93ac" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "8b0f9973-6fca-4d87-b62f-c32a1cc12a52" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:11:34 -0400 (0:00:02.452) 0:02:22.982 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002307", "end": "2026-04-17 20:11:36.596157", "rc": 0, "start": "2026-04-17 20:11:36.593850" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:11:36 -0400 (0:00:02.519) 0:02:25.501 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002392", "end": "2026-04-17 20:11:37.847955", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:11:37.845563" } STDOUT: luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:11:38 -0400 (0:00:01.378) 0:02:26.880 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:11:38 -0400 (0:00:00.171) 0:02:27.051 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:11:38 -0400 (0:00:00.334) 0:02:27.386 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:11:38 -0400 (0:00:00.204) 0:02:27.590 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml 
for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:11:40 -0400 (0:00:01.225) 0:02:28.816 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:11:40 -0400 (0:00:00.346) 0:02:29.162 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:11:40 -0400 (0:00:00.290) 0:02:29.453 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:11:41 -0400 (0:00:00.390) 0:02:29.843 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:11:41 -0400 (0:00:00.261) 0:02:30.105 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:11:41 -0400 (0:00:00.212) 0:02:30.317 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:11:41 -0400 (0:00:00.240) 0:02:30.558 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:11:42 -0400 
(0:00:00.366) 0:02:30.924 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:11:42 -0400 (0:00:00.256) 0:02:31.181 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:11:42 -0400 (0:00:00.226) 0:02:31.408 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:11:42 -0400 (0:00:00.201) 0:02:31.609 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:11:43 -0400 (0:00:00.137) 0:02:31.746 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:11:43 -0400 (0:00:00.529) 0:02:32.276 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:11:43 -0400 (0:00:00.208) 0:02:32.485 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:11:44 -0400 (0:00:00.243) 0:02:32.729 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:11:44 -0400 (0:00:00.329) 0:02:33.058 ********** ok: [managed-node16] => { "changed": 
false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:11:44 -0400 (0:00:00.420) 0:02:33.479 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:11:44 -0400 (0:00:00.129) 0:02:33.608 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:11:45 -0400 (0:00:00.793) 0:02:34.402 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:11:46 -0400 (0:00:00.421) 0:02:34.823 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471070.9981825, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471070.9981825, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36719, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471070.9981825, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:11:47 -0400 (0:00:01.719) 0:02:36.542 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:11:48 -0400 (0:00:00.231) 0:02:36.773 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:11:48 -0400 (0:00:00.261) 0:02:37.035 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type 
(set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:11:48 -0400 (0:00:00.302) 0:02:37.338 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:11:48 -0400 (0:00:00.166) 0:02:37.505 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:11:48 -0400 (0:00:00.200) 0:02:37.705 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:11:49 -0400 (0:00:00.202) 0:02:37.908 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471071.1411831, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471071.1411831, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 172652, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471071.1411831, "nlink": 1, "path": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:11:50 -0400 (0:00:01.683) 0:02:39.591 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:11:55 -0400 (0:00:04.475) 0:02:44.067 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011600", "end": "2026-04-17 20:11:56.900292", "rc": 0, "start": "2026-04-17 20:11:56.888692" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 8b0f9973-6fca-4d87-b62f-c32a1cc12a52 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 923141 Threads: 2 Salt: 0c 1b f1 
98 6b ec a3 a3 22 91 4b f1 96 10 21 71 b1 2f 86 97 ff fe 46 96 c9 b0 89 be cf 55 0c dd AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: c9 6d 50 55 19 b3 23 1f 8f 79 64 44 cc 46 23 2f d7 f5 20 cf 3c 78 7c 47 81 cf be 37 37 1a 96 53 Digest: 9e 06 6b 3a 83 a7 7b c1 18 be 48 7e d6 c1 42 cf b9 38 0a ac 26 c1 d6 75 ea 5f 7c 02 13 45 ac 94 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:11:57 -0400 (0:00:01.816) 0:02:45.883 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:11:57 -0400 (0:00:00.360) 0:02:46.244 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:11:57 -0400 (0:00:00.361) 0:02:46.605 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:11:58 -0400 (0:00:00.323) 0:02:46.929 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:11:58 -0400 (0:00:00.231) 0:02:47.160 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:11:58 -0400 (0:00:00.244) 0:02:47.404 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:11:58 -0400 (0:00:00.220) 0:02:47.625 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:11:59 -0400 (0:00:00.224) 0:02:47.849 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } 
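From this point the play verifies each artifact the storage role just created on managed-node16: the /dev/mapper device, the /etc/fstab and /etc/crypttab entries, and the LUKS2 header on /dev/sda. For context only, here is a minimal sketch of a playbook that would request the same end state through the role's storage_volumes interface (the same keys that appear in the blivet output above). It is not the tests_luks.yml driver used in this run, and the password value is a placeholder standing in for the no_log secret.

# Sketch only: ask the storage role for a LUKS-encrypted xfs volume on sda,
# mounted at /opt/test1, matching the state produced in the log above.
- hosts: managed-node16
  become: true
  tasks:
    - name: Create the encrypted volume via the storage role
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_password: "CHANGEME-placeholder"  # placeholder, not the real secret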
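The facts just set (_storage_test_crypttab_entries and the expected match counts) drive the crypttab assertions that follow. As a standalone illustration under the same assumptions (luks_name below is a variable invented for this sketch; the test itself works from the _storage_test_* facts), an equivalent check could look like:

# Sketch only: confirm /etc/crypttab carries exactly one entry for the new
# LUKS mapping, backed by /dev/sda with no key file ("-").
- hosts: managed-node16
  vars:
    luks_name: luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52
  tasks:
    - name: Read /etc/crypttab
      command: cat /etc/crypttab
      register: crypttab
      changed_when: false
      failed_when: false

    - name: Check the entry for the LUKS mapping
      assert:
        that:
          - crypttab.stdout_lines | select('match', '^' ~ luks_name ~ ' ') | list | length == 1
          - (luks_name ~ ' /dev/sda -') in crypttab.stdout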
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:11:59 -0400 (0:00:00.235) 0:02:48.085 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:11:59 -0400 (0:00:00.263) 0:02:48.349 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:11:59 -0400 (0:00:00.277) 0:02:48.626 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:12:00 -0400 (0:00:00.246) 0:02:48.873 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:12:00 -0400 (0:00:00.296) 0:02:49.170 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:12:00 -0400 (0:00:00.259) 0:02:49.429 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:12:00 -0400 (0:00:00.193) 0:02:49.623 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:12:01 -0400 (0:00:00.237) 0:02:49.860 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:12:01 -0400 (0:00:00.283) 0:02:50.143 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:12:01 -0400 (0:00:00.236) 0:02:50.380 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:12:01 -0400 (0:00:00.237) 0:02:50.618 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:12:02 -0400 (0:00:00.207) 0:02:50.825 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:12:02 -0400 (0:00:00.281) 0:02:51.107 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:12:02 -0400 (0:00:00.303) 0:02:51.411 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:12:02 -0400 (0:00:00.277) 0:02:51.689 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:12:03 -0400 (0:00:00.257) 0:02:51.946 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:12:03 -0400 (0:00:00.289) 0:02:52.235 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:12:03 -0400 (0:00:00.381) 0:02:52.616 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:12:04 -0400 (0:00:00.170) 0:02:52.786 ********** ok: 
[managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:12:04 -0400 (0:00:00.218) 0:02:53.005 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:12:04 -0400 (0:00:00.202) 0:02:53.208 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:12:04 -0400 (0:00:00.206) 0:02:53.415 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:12:04 -0400 (0:00:00.203) 0:02:53.618 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:12:05 -0400 (0:00:00.248) 0:02:53.866 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:12:05 -0400 (0:00:00.232) 0:02:54.099 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:12:05 -0400 (0:00:00.162) 0:02:54.261 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:12:05 -0400 (0:00:00.240) 0:02:54.502 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:12:06 -0400 (0:00:00.330) 0:02:54.832 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:12:06 -0400 
(0:00:00.161) 0:02:54.993 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:12:06 -0400 (0:00:00.195) 0:02:55.189 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:12:06 -0400 (0:00:00.180) 0:02:55.370 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:12:06 -0400 (0:00:00.269) 0:02:55.639 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:12:07 -0400 (0:00:00.291) 0:02:55.931 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:12:07 -0400 (0:00:00.214) 0:02:56.146 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:12:07 -0400 (0:00:00.191) 0:02:56.337 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:12:07 -0400 (0:00:00.279) 0:02:56.617 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:12:08 -0400 (0:00:00.312) 0:02:56.929 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:12:08 -0400 (0:00:00.196) 0:02:57.126 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 
20:12:08 -0400 (0:00:00.285) 0:02:57.411 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:12:08 -0400 (0:00:00.235) 0:02:57.647 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:12:09 -0400 (0:00:00.186) 0:02:57.833 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:12:09 -0400 (0:00:00.221) 0:02:58.054 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:12:09 -0400 (0:00:00.194) 0:02:58.249 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:12:09 -0400 (0:00:00.247) 0:02:58.496 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:12:09 -0400 (0:00:00.186) 0:02:58.683 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:12:10 -0400 (0:00:00.249) 0:02:58.933 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:12:10 -0400 (0:00:00.167) 0:02:59.101 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:12:10 -0400 (0:00:00.180) 0:02:59.282 ********** skipping: [managed-node16] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:12:10 -0400 (0:00:00.139) 0:02:59.421 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:12:10 -0400 (0:00:00.258) 0:02:59.680 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:12:11 -0400 (0:00:00.164) 0:02:59.845 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:12:11 -0400 (0:00:00.195) 0:03:00.040 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:116 Friday 17 April 2026 20:12:13 -0400 (0:00:02.535) 0:03:02.576 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:12:14 -0400 (0:00:00.247) 0:03:02.823 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:12:14 -0400 (0:00:00.213) 0:03:03.037 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:12:14 -0400 (0:00:00.184) 0:03:03.221 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:12:14 -0400 (0:00:00.126) 0:03:03.348 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:12:14 -0400 (0:00:00.310) 0:03:03.658 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:12:16 -0400 (0:00:01.879) 0:03:05.538 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:12:17 -0400 (0:00:00.213) 0:03:05.751 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:12:18 -0400 (0:00:01.785) 0:03:07.537 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:12:19 -0400 (0:00:00.427) 0:03:07.965 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:12:19 -0400 (0:00:00.228) 0:03:08.194 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:12:19 -0400 (0:00:00.212) 0:03:08.406 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:12:19 -0400 (0:00:00.177) 0:03:08.584 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:12:20 -0400 (0:00:00.217) 0:03:08.801 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:12:20 -0400 (0:00:00.361) 0:03:09.162 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:12:20 -0400 (0:00:00.140) 0:03:09.302 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:12:21 -0400 (0:00:00.468) 0:03:09.770 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:12:24 -0400 (0:00:03.726) 0:03:13.497 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:12:24 -0400 (0:00:00.170) 0:03:13.667 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:12:25 
-0400 (0:00:00.123) 0:03:13.791 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:12:30 -0400 (0:00:05.019) 0:03:18.810 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:12:30 -0400 (0:00:00.152) 0:03:18.963 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:12:30 -0400 (0:00:00.136) 0:03:19.100 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:12:30 -0400 (0:00:00.152) 0:03:19.253 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:12:30 -0400 (0:00:00.150) 0:03:19.403 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:12:35 -0400 (0:00:04.336) 0:03:23.739 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": 
"dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": 
"lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" 
}, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:12:37 -0400 (0:00:02.954) 0:03:26.694 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:12:38 -0400 (0:00:00.269) 0:03:26.963 ********** fatal: [managed-node16]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:12:43 -0400 (0:00:05.480) 0:03:32.444 ********** fatal: [managed-node16]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:12:44 -0400 (0:00:00.296) 0:03:32.741 ********** TASK [Check that we 
failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:12:44 -0400 (0:00:00.213) 0:03:32.955 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:12:44 -0400 (0:00:00.241) 0:03:33.196 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:12:44 -0400 (0:00:00.267) 0:03:33.464 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:12:44 -0400 (0:00:00.200) 0:03:33.665 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471133.6573746, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471133.6573746, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776471133.6573746, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "457506365", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:12:46 -0400 (0:00:01.203) 0:03:34.868 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:136 Friday 17 April 2026 20:12:46 -0400 (0:00:00.198) 0:03:35.067 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:12:46 -0400 (0:00:00.255) 0:03:35.323 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 
Friday 17 April 2026 20:12:46 -0400 (0:00:00.390) 0:03:35.713 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:12:47 -0400 (0:00:00.216) 0:03:35.930 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:12:48 -0400 (0:00:01.544) 0:03:37.474 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:12:48 -0400 (0:00:00.188) 0:03:37.662 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:12:50 -0400 (0:00:01.831) 0:03:39.493 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:12:51 -0400 (0:00:00.426) 0:03:39.919 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:12:51 -0400 (0:00:00.164) 0:03:40.084 
********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:12:51 -0400 (0:00:00.160) 0:03:40.245 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:12:51 -0400 (0:00:00.158) 0:03:40.403 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:12:51 -0400 (0:00:00.188) 0:03:40.592 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:12:52 -0400 (0:00:00.310) 0:03:40.902 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:12:52 -0400 (0:00:00.142) 0:03:41.045 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:12:52 -0400 (0:00:00.198) 0:03:41.243 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:12:56 -0400 (0:00:04.118) 0:03:45.361 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:12:56 -0400 (0:00:00.188) 0:03:45.550 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:12:57 -0400 (0:00:00.182) 0:03:45.733 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], 
"pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:13:02 -0400 (0:00:05.048) 0:03:50.781 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:13:02 -0400 (0:00:00.230) 0:03:51.012 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:13:02 -0400 (0:00:00.170) 0:03:51.182 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:13:02 -0400 (0:00:00.235) 0:03:51.417 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:13:02 -0400 (0:00:00.157) 0:03:51.574 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:13:07 -0400 (0:00:04.185) 0:03:55.759 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": 
"stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": 
"sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:13:09 -0400 (0:00:02.369) 0:03:58.129 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:13:09 -0400 (0:00:00.285) 0:03:58.414 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": 
"luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:13:15 -0400 (0:00:05.368) 0:04:03.783 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:13:15 -0400 (0:00:00.272) 0:04:04.055 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471083.4172206, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6ac440590b7acdd2acb326ebd9bcff7af1178e14", "ctime": 1776471083.4142206, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471083.4142206, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:13:17 -0400 (0:00:01.828) 0:04:05.883 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:13:18 -0400 (0:00:01.565) 0:04:07.449 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:13:19 -0400 (0:00:00.346) 0:04:07.795 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:13:19 -0400 (0:00:00.200) 0:04:07.996 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:13:19 -0400 (0:00:00.275) 0:04:08.271 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:13:19 -0400 (0:00:00.268) 0:04:08.540 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:13:21 -0400 (0:00:01.587) 0:04:10.128 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:13:23 -0400 (0:00:02.027) 0:04:12.155 ********** changed: [managed-node16] => (item={'src': 'UUID=0866b469-2e99-4871-99e5-9d1926d1d10c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c" } TASK [fedora.linux_system_roles.storage : Manage mount 
ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:13:24 -0400 (0:00:01.371) 0:04:13.526 ********** skipping: [managed-node16] => (item={'src': 'UUID=0866b469-2e99-4871-99e5-9d1926d1d10c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:13:25 -0400 (0:00:00.264) 0:04:13.791 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:13:26 -0400 (0:00:01.752) 0:04:15.543 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471097.8472648, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9d13ca600ac6bf648f1b2ebb9f968a48a5027391", "ctime": 1776471088.437236, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 442499214, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471088.4362361, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "995765689", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:13:28 -0400 (0:00:01.489) 0:04:17.033 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda', 'name': 'luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:13:30 -0400 (0:00:01.896) 0:04:18.930 ********** ok: [managed-node16] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:148 Friday 17 April 2026 20:13:32 -0400 (0:00:02.261) 0:04:21.191 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:13:32 -0400 (0:00:00.499) 0:04:21.691 ********** skipping: [managed-node16] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:13:33 -0400 (0:00:00.249) 0:04:21.940 ********** ok: [managed-node16] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:13:33 -0400 (0:00:00.254) 0:04:22.194 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "0866b469-2e99-4871-99e5-9d1926d1d10c" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:13:35 -0400 (0:00:01.626) 0:04:23.820 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002202", "end": "2026-04-17 20:13:36.319393", "rc": 0, "start": "2026-04-17 20:13:36.317191" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=0866b469-2e99-4871-99e5-9d1926d1d10c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:13:36 -0400 (0:00:01.547) 0:04:25.367 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002271", "end": "2026-04-17 20:13:37.995226", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:13:37.992955" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:13:38 -0400 (0:00:01.643) 0:04:27.011 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:13:38 -0400 (0:00:00.188) 0:04:27.200 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:13:38 -0400 (0:00:00.470) 0:04:27.670 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:13:39 -0400 (0:00:00.318) 0:04:27.988 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:13:40 -0400 (0:00:01.182) 0:04:29.171 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:13:40 -0400 (0:00:00.243) 0:04:29.415 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:13:40 -0400 (0:00:00.207) 0:04:29.622 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:13:41 -0400 (0:00:00.237) 0:04:29.860 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:13:41 -0400 (0:00:00.160) 0:04:30.020 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:13:41 -0400 (0:00:00.174) 0:04:30.195 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:13:41 -0400 (0:00:00.163) 0:04:30.358 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:13:41 -0400 (0:00:00.172) 0:04:30.530 ********** skipping: [managed-node16] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:13:42 -0400 (0:00:00.200) 0:04:30.731 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:13:42 -0400 (0:00:00.195) 0:04:30.926 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:13:42 -0400 (0:00:00.247) 0:04:31.174 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:13:42 -0400 (0:00:00.176) 0:04:31.350 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:13:42 -0400 (0:00:00.344) 0:04:31.695 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:13:43 -0400 (0:00:00.259) 0:04:31.954 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:13:43 -0400 (0:00:00.260) 0:04:32.214 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:13:43 -0400 (0:00:00.158) 0:04:32.373 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:13:43 -0400 (0:00:00.191) 0:04:32.564 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:13:43 -0400 (0:00:00.138) 0:04:32.703 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:13:44 -0400 (0:00:00.275) 0:04:32.978 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:13:44 -0400 (0:00:00.187) 0:04:33.165 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471194.7505617, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471194.7505617, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36719, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471194.7505617, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:13:45 -0400 (0:00:00.870) 0:04:34.035 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:13:45 -0400 (0:00:00.375) 0:04:34.411 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:13:45 -0400 (0:00:00.300) 0:04:34.712 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:13:46 -0400 (0:00:00.226) 0:04:34.938 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:13:46 -0400 (0:00:00.218) 0:04:35.156 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:13:47 -0400 (0:00:00.604) 0:04:35.761 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:13:47 -0400 (0:00:00.158) 0:04:35.919 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:13:47 -0400 (0:00:00.235) 0:04:36.154 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:13:51 -0400 (0:00:04.385) 0:04:40.540 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:13:52 -0400 (0:00:00.227) 0:04:40.767 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:13:52 -0400 (0:00:00.163) 0:04:40.931 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:13:52 -0400 (0:00:00.257) 0:04:41.189 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:13:52 -0400 (0:00:00.225) 0:04:41.414 ********** skipping: 
[managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:13:52 -0400 (0:00:00.252) 0:04:41.667 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:13:53 -0400 (0:00:00.205) 0:04:41.872 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:13:53 -0400 (0:00:00.296) 0:04:42.168 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:13:53 -0400 (0:00:00.296) 0:04:42.465 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:13:54 -0400 (0:00:00.437) 0:04:42.902 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:13:54 -0400 (0:00:00.281) 0:04:43.184 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:13:54 -0400 (0:00:00.316) 0:04:43.500 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:13:55 -0400 (0:00:00.271) 0:04:43.771 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:13:55 -0400 (0:00:00.251) 0:04:44.022 ********** ok: [managed-node16] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:13:55 -0400 (0:00:00.255) 0:04:44.278 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:13:55 -0400 (0:00:00.307) 0:04:44.586 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:13:56 -0400 (0:00:00.249) 0:04:44.836 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:13:56 -0400 (0:00:00.166) 0:04:45.003 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:13:56 -0400 (0:00:00.243) 0:04:45.247 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:13:56 -0400 (0:00:00.191) 0:04:45.438 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:13:56 -0400 (0:00:00.200) 0:04:45.639 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:13:57 -0400 (0:00:00.144) 0:04:45.783 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:13:57 -0400 (0:00:00.245) 0:04:46.029 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:13:57 -0400 (0:00:00.317) 0:04:46.346 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:13:57 -0400 (0:00:00.273) 0:04:46.620 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:13:58 -0400 (0:00:00.264) 0:04:46.885 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:13:58 -0400 (0:00:00.255) 0:04:47.141 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:13:58 -0400 (0:00:00.322) 0:04:47.463 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:13:59 -0400 (0:00:00.348) 0:04:47.813 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:13:59 -0400 (0:00:00.361) 0:04:48.174 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:13:59 -0400 (0:00:00.169) 0:04:48.344 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:13:59 -0400 (0:00:00.228) 0:04:48.572 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:14:00 -0400 (0:00:00.165) 0:04:48.738 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } 
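Note on the size-verification subset above: every step is skipped for this volume, since these checks appear to apply only to LVM volumes whose requested size is given as a percentage of the pool. For context, a minimal sketch of how an expected size could be derived from a pool size and a percentage in a playbook task; the variable names pool_size, size_percent and expected_size here are illustrative only and are not the role's own test variables:

    - name: Calculate expected size from pool size and percentage (illustrative sketch)
      set_fact:
        expected_size: "{{ (pool_size | int * (size_percent | int) / 100) | int }}"
      vars:
        pool_size: 10737418240      # pool size in bytes (assumed example value, 10 GiB)
        size_percent: 60            # volume requested as "60%" of the pool

The tasks that follow additionally reserve space for thin pools before comparing expected and actual sizes; here they are skipped as well because the test volume is a plain disk.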
TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:14:00 -0400 (0:00:00.223) 0:04:48.962 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:14:00 -0400 (0:00:00.194) 0:04:49.157 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:14:00 -0400 (0:00:00.151) 0:04:49.308 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:14:00 -0400 (0:00:00.158) 0:04:49.466 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:14:00 -0400 (0:00:00.155) 0:04:49.621 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:14:01 -0400 (0:00:00.241) 0:04:49.863 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:14:01 -0400 (0:00:00.322) 0:04:50.186 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:14:01 -0400 (0:00:00.309) 0:04:50.496 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:14:02 -0400 (0:00:00.298) 0:04:50.795 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:14:02 -0400 (0:00:00.210) 0:04:51.006 ********** skipping: [managed-node16] => 
{} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:14:02 -0400 (0:00:00.258) 0:04:51.264 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:14:02 -0400 (0:00:00.306) 0:04:51.571 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:14:03 -0400 (0:00:00.267) 0:04:51.838 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:14:03 -0400 (0:00:00.168) 0:04:52.007 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:14:03 -0400 (0:00:00.249) 0:04:52.256 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:14:03 -0400 (0:00:00.242) 0:04:52.498 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:14:03 -0400 (0:00:00.199) 0:04:52.698 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:14:04 -0400 (0:00:00.237) 0:04:52.935 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:14:04 -0400 (0:00:00.144) 0:04:53.079 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:14:04 -0400 (0:00:00.255) 0:04:53.335 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:14:04 -0400 (0:00:00.219) 0:04:53.554 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:14:05 -0400 (0:00:00.173) 0:04:53.728 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:14:05 -0400 (0:00:00.152) 0:04:53.880 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:14:05 -0400 (0:00:00.164) 0:04:54.045 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:14:05 -0400 (0:00:00.172) 0:04:54.218 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:14:05 -0400 (0:00:00.186) 0:04:54.405 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:14:05 -0400 (0:00:00.163) 0:04:54.569 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:14:05 -0400 (0:00:00.145) 0:04:54.714 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:154 Friday 17 April 2026 20:14:07 -0400 (0:00:01.699) 0:04:56.414 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:14:08 -0400 (0:00:00.485) 0:04:56.900 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:14:08 -0400 (0:00:00.214) 0:04:57.114 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:14:08 -0400 (0:00:00.317) 0:04:57.432 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:14:08 -0400 (0:00:00.153) 0:04:57.586 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:14:09 -0400 (0:00:00.332) 0:04:57.919 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:14:10 -0400 (0:00:01.794) 0:04:59.713 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:14:11 -0400 (0:00:00.238) 0:04:59.952 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:14:13 -0400 (0:00:02.149) 0:05:02.101 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: 
[managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:14:13 -0400 (0:00:00.434) 0:05:02.536 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:14:14 -0400 (0:00:00.206) 0:05:02.742 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:14:14 -0400 (0:00:00.275) 0:05:03.017 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:14:14 -0400 (0:00:00.176) 0:05:03.194 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:14:14 -0400 (0:00:00.140) 0:05:03.335 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:14:15 -0400 (0:00:00.449) 0:05:03.784 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:14:15 -0400 (0:00:00.169) 0:05:03.953 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:14:15 -0400 (0:00:00.202) 0:05:04.156 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:14:19 -0400 (0:00:03.984) 0:05:08.140 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:14:19 -0400 (0:00:00.203) 0:05:08.344 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:14:19 -0400 (0:00:00.175) 0:05:08.519 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:14:24 -0400 (0:00:05.095) 0:05:13.614 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:14:25 -0400 (0:00:00.121) 0:05:13.735 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:14:25 -0400 (0:00:00.122) 0:05:13.858 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:14:25 -0400 (0:00:00.096) 0:05:13.954 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:14:25 -0400 (0:00:00.062) 0:05:14.017 ********** ok: [managed-node16] => { "changed": false, "rc": 0, 
"results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:14:28 -0400 (0:00:03.702) 0:05:17.720 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": 
"dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service": { "name": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service": { "name": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": 
"systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:14:31 -0400 (0:00:02.980) 0:05:20.700 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d8b0f9973\x2d6fca\x2d4d87\x2db62f\x2dc32a1cc12a52.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "name": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket dev-sda.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override 
cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", 
"NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:13:26 EDT", "StateChangeTimestampMonotonic": "1864543340", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...d6fca\x2d4d87\x2db62f\x2dc32a1cc12a52.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "name": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner 
cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": 
"no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:14:35 -0400 (0:00:03.128) 0:05:23.828 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:14:39 -0400 (0:00:04.438) 0:05:28.267 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:14:39 -0400 (0:00:00.117) 0:05:28.385 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d8b0f9973\x2d6fca\x2d4d87\x2db62f\x2dc32a1cc12a52.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "name": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": 
"0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service is masked.\"", "LoadState": 
"masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d8b0f9973\\x2d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...d6fca\x2d4d87\x2db62f\x2dc32a1cc12a52.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "name": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", 
"CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": 
"0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6fca\\x2d4d87\\x2db62f\\x2dc32a1cc12a52.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:14:41 -0400 (0:00:02.117) 0:05:30.502 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:14:41 -0400 (0:00:00.181) 0:05:30.684 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:14:42 -0400 (0:00:00.177) 0:05:30.861 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:14:42 -0400 (0:00:00.143) 0:05:31.005 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471247.364723, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", 
"checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471247.364723, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776471247.364723, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "281934679", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:14:43 -0400 (0:00:01.398) 0:05:32.403 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:174 Friday 17 April 2026 20:14:44 -0400 (0:00:00.322) 0:05:32.726 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:14:44 -0400 (0:00:00.410) 0:05:33.137 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:14:44 -0400 (0:00:00.309) 0:05:33.446 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:14:44 -0400 (0:00:00.231) 0:05:33.678 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:14:46 -0400 (0:00:01.876) 0:05:35.555 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:14:47 -0400 (0:00:00.207) 0:05:35.763 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:14:48 -0400 (0:00:01.610) 0:05:37.373 ********** skipping: [managed-node16] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:14:49 -0400 (0:00:00.446) 0:05:37.819 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:14:49 -0400 (0:00:00.270) 0:05:38.089 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:14:49 -0400 (0:00:00.151) 0:05:38.241 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:14:49 -0400 (0:00:00.127) 0:05:38.369 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:14:49 -0400 (0:00:00.188) 0:05:38.557 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:14:50 -0400 
(0:00:00.504) 0:05:39.062 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:14:50 -0400 (0:00:00.276) 0:05:39.339 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:14:50 -0400 (0:00:00.237) 0:05:39.576 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:14:55 -0400 (0:00:04.286) 0:05:43.863 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:14:55 -0400 (0:00:00.526) 0:05:44.390 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:14:55 -0400 (0:00:00.171) 0:05:44.561 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:15:00 -0400 (0:00:04.987) 0:05:49.549 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:15:01 -0400 (0:00:00.232) 0:05:49.781 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:15:01 -0400 (0:00:00.142) 0:05:49.923 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:15:01 -0400 (0:00:00.182) 0:05:50.105 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:15:01 -0400 (0:00:00.123) 0:05:50.229 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:15:05 -0400 (0:00:03.515) 0:05:53.745 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:15:07 -0400 (0:00:02.355) 0:05:56.100 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:15:07 -0400 (0:00:00.291) 0:05:56.392 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-97188206-9248-4be9-89ce-bf705b443643", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:15:21 -0400 (0:00:13.472) 0:06:09.865 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:15:21 -0400 (0:00:00.270) 0:06:10.135 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471204.5605917, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "868d9effa202e37fea3a9344097baa0d4ff1d714", "ctime": 1776471204.557592, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471204.557592, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:15:22 -0400 (0:00:01.199) 0:06:11.334 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:15:24 -0400 (0:00:01.508) 0:06:12.843 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:15:24 -0400 (0:00:00.359) 0:06:13.202 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-97188206-9248-4be9-89ce-bf705b443643", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", 
"/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:15:24 -0400 (0:00:00.217) 0:06:13.420 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:15:24 -0400 (0:00:00.227) 0:06:13.648 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": 
null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:15:25 -0400 (0:00:00.284) 0:06:13.933 ********** changed: [managed-node16] => (item={'src': 'UUID=0866b469-2e99-4871-99e5-9d1926d1d10c', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=0866b469-2e99-4871-99e5-9d1926d1d10c" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:15:26 -0400 (0:00:01.647) 0:06:15.581 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:15:29 -0400 (0:00:02.168) 0:06:17.749 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:15:30 -0400 (0:00:01.462) 0:06:19.211 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:15:30 -0400 (0:00:00.216) 0:06:19.428 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:15:32 -0400 (0:00:01.733) 0:06:21.161 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471217.994633, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471209.9346082, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 52429003, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776471209.9326084, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2419412962", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:15:33 -0400 (0:00:01.458) 0:06:22.620 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda', 'name': 'luks-97188206-9248-4be9-89ce-bf705b443643', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-97188206-9248-4be9-89ce-bf705b443643", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:15:35 -0400 (0:00:01.785) 0:06:24.405 ********** ok: [managed-node16] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:186 Friday 17 April 2026 20:15:37 -0400 (0:00:02.239) 0:06:26.645 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:15:38 -0400 (0:00:00.695) 0:06:27.341 ********** skipping: [managed-node16] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:15:38 -0400 (0:00:00.195) 0:06:27.536 ********** ok: [managed-node16] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:15:39 -0400 (0:00:00.255) 0:06:27.792 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "size": "10G", "type": "crypt", "uuid": "4096fdbc-b93e-44c1-8ae1-45792bcdf8ab" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "97188206-9248-4be9-89ce-bf705b443643" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:15:39 -0400 (0:00:00.872) 0:06:28.665 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002270", "end": "2026-04-17 20:15:40.789338", "rc": 0, "start": "2026-04-17 20:15:40.787068" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:15:41 -0400 (0:00:01.128) 0:06:29.793 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002515", "end": "2026-04-17 20:15:41.950970", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:15:41.948455" } STDOUT: luks-97188206-9248-4be9-89ce-bf705b443643 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:15:42 -0400 (0:00:01.089) 0:06:30.882 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:15:42 -0400 (0:00:00.210) 0:06:31.092 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:15:42 -0400 (0:00:00.330) 0:06:31.423 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:15:42 -0400 (0:00:00.252) 0:06:31.675 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml 
for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:15:44 -0400 (0:00:01.116) 0:06:32.792 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:15:44 -0400 (0:00:00.273) 0:06:33.065 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:15:44 -0400 (0:00:00.247) 0:06:33.313 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:15:44 -0400 (0:00:00.281) 0:06:33.595 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:15:45 -0400 (0:00:00.380) 0:06:33.975 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:15:45 -0400 (0:00:00.286) 0:06:34.261 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:15:45 -0400 (0:00:00.247) 0:06:34.509 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:15:46 -0400 
(0:00:00.258) 0:06:34.768 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:15:46 -0400 (0:00:00.304) 0:06:35.072 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:15:46 -0400 (0:00:00.326) 0:06:35.399 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:15:46 -0400 (0:00:00.299) 0:06:35.698 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:15:47 -0400 (0:00:00.248) 0:06:35.947 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:15:47 -0400 (0:00:00.359) 0:06:36.306 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:15:47 -0400 (0:00:00.176) 0:06:36.483 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:15:47 -0400 (0:00:00.220) 0:06:36.704 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:15:48 -0400 (0:00:00.246) 0:06:36.950 ********** ok: [managed-node16] => { "changed": 
false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:15:48 -0400 (0:00:00.270) 0:06:37.220 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:15:48 -0400 (0:00:00.186) 0:06:37.407 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:15:49 -0400 (0:00:00.355) 0:06:37.763 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:15:49 -0400 (0:00:00.427) 0:06:38.190 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471320.7419436, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471320.7419436, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36719, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471320.7419436, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:15:50 -0400 (0:00:01.504) 0:06:39.694 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:15:51 -0400 (0:00:00.220) 0:06:39.914 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:15:51 -0400 (0:00:00.189) 0:06:40.104 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type 
(set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:15:51 -0400 (0:00:00.220) 0:06:40.325 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:15:51 -0400 (0:00:00.263) 0:06:40.588 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:15:52 -0400 (0:00:00.215) 0:06:40.803 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:15:52 -0400 (0:00:00.165) 0:06:40.968 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471320.8669438, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471320.8669438, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 201495, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471320.8669438, "nlink": 1, "path": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:15:53 -0400 (0:00:01.141) 0:06:42.110 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:15:57 -0400 (0:00:03.650) 0:06:45.761 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.009837", "end": "2026-04-17 20:15:58.219970", "rc": 0, "start": "2026-04-17 20:15:58.210133" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 97188206-9248-4be9-89ce-bf705b443643 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 930270 Threads: 2 Salt: 9e 73 60 
e9 95 28 a8 b7 f9 54 2e ce 0b f4 34 6d cc ad 98 74 a5 53 a1 4c 4d a5 6a f1 a3 d0 cb 50 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 119809 Salt: 9a 88 6d c3 54 43 d5 dd 69 27 d6 3e 9a c0 b1 45 2a c4 09 72 de 5e 54 7d 0a dc f5 84 0e 31 5d f0 Digest: 62 fe 64 56 7c c8 84 5e 0e 11 67 cb c3 19 ae c1 72 7a fe 9c 63 65 3b 98 da 7a 44 f1 0d 41 cc 51 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:15:58 -0400 (0:00:01.451) 0:06:47.212 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:15:58 -0400 (0:00:00.234) 0:06:47.446 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:15:59 -0400 (0:00:00.282) 0:06:47.729 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:15:59 -0400 (0:00:00.177) 0:06:47.906 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:15:59 -0400 (0:00:00.162) 0:06:48.068 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:15:59 -0400 (0:00:00.225) 0:06:48.293 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:15:59 -0400 (0:00:00.204) 0:06:48.497 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:16:00 -0400 (0:00:00.288) 0:06:48.786 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-97188206-9248-4be9-89ce-bf705b443643 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } 
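The crypttab and volume facts above all describe a single unpooled LUKS volume. For reference, a minimal sketch of what an equivalent role invocation might look like, assuming the role's documented storage_volumes variable and reusing the values reported for the "foo" volume in this run (the password below is a placeholder; the real test value is masked as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER):

- hosts: managed-node16
  vars:
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        fs_type: xfs
        mount_point: /opt/test1
        encryption: true
        encryption_password: "CHANGE_ME"   # placeholder only; supply via vault or no_log vars
  roles:
    - fedora.linux_system_roles.storage

With a spec like this, the role would be expected to produce the same actions logged earlier: destroy the existing xfs signature on /dev/sda, format it as a LUKS2 container, open it as /dev/mapper/luks-&lt;UUID&gt;, create xfs on the mapper device, mount it at /opt/test1, and add the matching /etc/fstab and /etc/crypttab entries that the following tasks verify.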
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:16:00 -0400 (0:00:00.313) 0:06:49.100 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:16:00 -0400 (0:00:00.195) 0:06:49.295 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:16:00 -0400 (0:00:00.247) 0:06:49.543 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:16:01 -0400 (0:00:00.229) 0:06:49.772 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:16:01 -0400 (0:00:00.273) 0:06:50.046 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:16:01 -0400 (0:00:00.193) 0:06:50.240 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:16:01 -0400 (0:00:00.180) 0:06:50.420 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:16:01 -0400 (0:00:00.186) 0:06:50.607 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:16:02 -0400 (0:00:00.628) 0:06:51.235 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:16:02 -0400 (0:00:00.195) 0:06:51.431 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:16:02 -0400 (0:00:00.184) 0:06:51.616 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:16:03 -0400 (0:00:00.179) 0:06:51.795 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:16:03 -0400 (0:00:00.185) 0:06:51.981 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:16:03 -0400 (0:00:00.185) 0:06:52.166 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:16:03 -0400 (0:00:00.292) 0:06:52.459 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:16:03 -0400 (0:00:00.256) 0:06:52.716 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:16:04 -0400 (0:00:00.223) 0:06:52.940 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:16:04 -0400 (0:00:00.224) 0:06:53.165 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:16:04 -0400 (0:00:00.182) 0:06:53.347 ********** ok: 
[managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:16:04 -0400 (0:00:00.178) 0:06:53.525 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:16:05 -0400 (0:00:00.251) 0:06:53.777 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:16:05 -0400 (0:00:00.317) 0:06:54.094 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:16:05 -0400 (0:00:00.266) 0:06:54.361 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:16:05 -0400 (0:00:00.237) 0:06:54.598 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:16:06 -0400 (0:00:00.209) 0:06:54.808 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:16:06 -0400 (0:00:00.355) 0:06:55.164 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:16:06 -0400 (0:00:00.258) 0:06:55.422 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:16:06 -0400 (0:00:00.218) 0:06:55.641 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:16:07 -0400 
(0:00:00.242) 0:06:55.884 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:16:07 -0400 (0:00:00.221) 0:06:56.105 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:16:07 -0400 (0:00:00.239) 0:06:56.344 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:16:07 -0400 (0:00:00.164) 0:06:56.509 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:16:07 -0400 (0:00:00.193) 0:06:56.702 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:16:08 -0400 (0:00:00.187) 0:06:56.889 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:16:08 -0400 (0:00:00.278) 0:06:57.167 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:16:08 -0400 (0:00:00.314) 0:06:57.482 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:16:09 -0400 (0:00:00.257) 0:06:57.739 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:16:09 -0400 (0:00:00.250) 0:06:57.990 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 
20:16:09 -0400 (0:00:00.324) 0:06:58.314 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:16:09 -0400 (0:00:00.298) 0:06:58.612 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:16:10 -0400 (0:00:00.225) 0:06:58.838 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:16:10 -0400 (0:00:00.182) 0:06:59.020 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:16:10 -0400 (0:00:00.316) 0:06:59.337 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:16:10 -0400 (0:00:00.246) 0:06:59.583 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:16:11 -0400 (0:00:00.266) 0:06:59.849 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:16:11 -0400 (0:00:00.264) 0:07:00.113 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:16:12 -0400 (0:00:00.639) 0:07:00.753 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:16:12 -0400 (0:00:00.193) 0:07:00.947 ********** skipping: [managed-node16] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:16:12 -0400 (0:00:00.207) 0:07:01.155 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:16:12 -0400 (0:00:00.262) 0:07:01.417 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:16:12 -0400 (0:00:00.221) 0:07:01.638 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:193 Friday 17 April 2026 20:16:13 -0400 (0:00:00.154) 0:07:01.792 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:16:13 -0400 (0:00:00.345) 0:07:02.138 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:16:13 -0400 (0:00:00.186) 0:07:02.325 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:16:14 -0400 (0:00:00.414) 0:07:02.740 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:16:14 -0400 (0:00:00.160) 0:07:02.900 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:16:14 -0400 (0:00:00.188) 0:07:03.089 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:16:16 -0400 (0:00:01.743) 0:07:04.833 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:16:16 -0400 (0:00:00.176) 0:07:05.009 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:16:18 -0400 (0:00:02.113) 0:07:07.123 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:16:18 -0400 (0:00:00.414) 0:07:07.538 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:16:19 -0400 (0:00:00.209) 0:07:07.748 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:16:19 -0400 (0:00:00.124) 0:07:07.872 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:16:19 -0400 (0:00:00.098) 0:07:07.971 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:16:19 -0400 (0:00:00.115) 0:07:08.087 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:16:19 -0400 (0:00:00.428) 0:07:08.515 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:16:19 -0400 (0:00:00.199) 0:07:08.715 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:16:20 -0400 (0:00:00.157) 0:07:08.873 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:16:24 -0400 (0:00:03.961) 0:07:12.835 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:16:24 -0400 (0:00:00.202) 0:07:13.037 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:16:24 -0400 (0:00:00.241) 0:07:13.279 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:16:29 -0400 (0:00:04.907) 0:07:18.186 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for 
managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:16:29 -0400 (0:00:00.248) 0:07:18.435 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:16:29 -0400 (0:00:00.126) 0:07:18.562 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:16:30 -0400 (0:00:00.182) 0:07:18.744 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:16:30 -0400 (0:00:00.112) 0:07:18.857 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:16:34 -0400 (0:00:04.316) 0:07:23.173 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": 
"plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", 
"status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:16:37 -0400 (0:00:02.859) 0:07:26.033 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:16:37 -0400 (0:00:00.318) 0:07:26.351 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:16:42 -0400 (0:00:05.386) 0:07:31.738 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:16:43 -0400 (0:00:00.195) 0:07:31.933 ********** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:16:43 -0400 (0:00:00.335) 0:07:32.269 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:16:43 -0400 (0:00:00.255) 0:07:32.524 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:16:44 -0400 (0:00:00.288) 0:07:32.813 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:212 Friday 17 April 2026 20:16:44 -0400 (0:00:00.126) 0:07:32.939 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:16:45 -0400 (0:00:00.922) 0:07:33.862 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:16:45 -0400 (0:00:00.176) 0:07:34.039 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:16:45 -0400 (0:00:00.201) 0:07:34.240 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:16:46 -0400 (0:00:01.429) 0:07:35.670 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:16:47 -0400 (0:00:00.288) 0:07:35.959 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:16:49 -0400 (0:00:02.160) 0:07:38.119 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", 
"skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:16:49 -0400 (0:00:00.238) 0:07:38.357 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:16:49 -0400 (0:00:00.178) 0:07:38.536 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:16:49 -0400 (0:00:00.105) 0:07:38.642 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:16:50 -0400 (0:00:00.102) 0:07:38.745 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:16:50 -0400 (0:00:00.133) 0:07:38.879 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:16:50 -0400 (0:00:00.391) 0:07:39.270 ********** skipping: [managed-node16] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:16:50 -0400 (0:00:00.181) 0:07:39.452 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:16:50 -0400 (0:00:00.155) 0:07:39.607 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:16:54 -0400 (0:00:03.936) 0:07:43.544 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:16:55 -0400 (0:00:00.287) 0:07:43.832 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:16:55 -0400 (0:00:00.274) 0:07:44.107 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:17:00 -0400 (0:00:05.196) 0:07:49.303 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:17:00 -0400 (0:00:00.416) 0:07:49.720 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:17:01 -0400 (0:00:00.174) 0:07:49.894 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:17:01 -0400 (0:00:00.239) 0:07:50.133 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:17:01 -0400 (0:00:00.144) 0:07:50.278 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:17:05 -0400 (0:00:04.371) 0:07:54.650 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:17:08 -0400 (0:00:02.905) 0:07:57.556 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:17:09 -0400 (0:00:00.333) 0:07:57.889 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-97188206-9248-4be9-89ce-bf705b443643", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:17:23 -0400 (0:00:14.267) 0:08:12.157 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:17:23 -0400 (0:00:00.341) 0:08:12.498 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471330.2299721, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ead4e4b9f8c85e27eb341f96da0dc0faf1b05a24", "ctime": 1776471330.226972, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471330.226972, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:17:25 -0400 (0:00:01.384) 0:08:13.883 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:17:26 -0400 (0:00:01.704) 0:08:15.587 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:17:27 -0400 (0:00:00.387) 0:08:15.974 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-97188206-9248-4be9-89ce-bf705b443643", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:17:27 -0400 (0:00:00.193) 0:08:16.168 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:17:27 -0400 (0:00:00.245) 0:08:16.414 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:17:27 -0400 (0:00:00.297) 0:08:16.711 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": 
"/dev/mapper/luks-97188206-9248-4be9-89ce-bf705b443643" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:17:29 -0400 (0:00:01.747) 0:08:18.459 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:17:31 -0400 (0:00:01.782) 0:08:20.241 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:17:33 -0400 (0:00:01.713) 0:08:21.955 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:17:33 -0400 (0:00:00.345) 0:08:22.301 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:17:35 -0400 (0:00:01.839) 0:08:24.141 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471341.9500074, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b4ab53d932b24ce21150210615f1f094a49f6f4f", "ctime": 1776471335.3659875, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 197132485, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471335.3659875, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": 
true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1488712924", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:17:36 -0400 (0:00:01.540) 0:08:25.682 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda', 'name': 'luks-97188206-9248-4be9-89ce-bf705b443643', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-97188206-9248-4be9-89ce-bf705b443643", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node16] => (item={'backing_device': '/dev/sda1', 'name': 'luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:17:39 -0400 (0:00:02.823) 0:08:28.505 ********** ok: [managed-node16] TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:228 Friday 17 April 2026 20:17:41 -0400 (0:00:01.753) 0:08:30.259 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:17:42 -0400 (0:00:00.480) 0:08:30.739 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:17:42 -0400 (0:00:00.242) 0:08:30.982 ********** skipping: [managed-node16] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:17:42 -0400 (0:00:00.249) 0:08:31.232 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "size": "4G", "type": "crypt", "uuid": "03b54d7e-b14c-4b7f-bc82-9bd2c17d83c0" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "2f0698e3-f8a3-4aa6-9d24-bac03448f798" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:17:44 -0400 (0:00:01.664) 0:08:32.897 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002437", "end": "2026-04-17 20:17:45.447227", "rc": 0, "start": "2026-04-17 20:17:45.444790" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are 
maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:17:45 -0400 (0:00:01.533) 0:08:34.431 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002449", "end": "2026-04-17 20:17:47.053059", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:17:47.050610" } STDOUT: luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:17:47 -0400 (0:00:01.600) 0:08:36.031 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:17:47 -0400 (0:00:00.474) 0:08:36.505 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:17:47 -0400 (0:00:00.151) 0:08:36.657 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:17:48 -0400 (0:00:00.285) 0:08:36.942 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 
2026 20:17:48 -0400 (0:00:00.361) 0:08:37.304 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:17:49 -0400 (0:00:00.478) 0:08:37.782 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:17:49 -0400 (0:00:00.212) 0:08:37.995 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:17:49 -0400 (0:00:00.291) 0:08:38.287 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:17:49 -0400 (0:00:00.256) 0:08:38.543 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:17:50 -0400 (0:00:00.218) 0:08:38.761 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:17:50 -0400 (0:00:00.259) 0:08:39.021 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:17:50 -0400 (0:00:00.274) 0:08:39.296 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:17:50 -0400 (0:00:00.187) 0:08:39.484 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 
17 April 2026 20:17:51 -0400 (0:00:00.276) 0:08:39.760 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:17:51 -0400 (0:00:00.262) 0:08:40.022 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.9.63 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:17:52 -0400 (0:00:01.497) 0:08:41.520 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:17:53 -0400 (0:00:00.220) 0:08:41.741 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:17:53 -0400 (0:00:00.327) 0:08:42.068 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:17:53 -0400 (0:00:00.181) 0:08:42.250 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:17:53 -0400 (0:00:00.175) 0:08:42.425 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:17:53 -0400 (0:00:00.249) 0:08:42.675 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:17:54 -0400 (0:00:00.185) 0:08:42.860 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:17:54 -0400 (0:00:00.244) 0:08:43.105 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:17:54 -0400 (0:00:00.220) 0:08:43.326 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:17:54 -0400 (0:00:00.188) 0:08:43.514 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:17:54 -0400 (0:00:00.139) 0:08:43.654 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:17:55 -0400 (0:00:00.175) 0:08:43.829 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:17:55 -0400 (0:00:00.259) 0:08:44.089 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:17:55 -0400 (0:00:00.145) 0:08:44.235 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:17:55 -0400 (0:00:00.386) 0:08:44.621 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 
'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:17:56 -0400 (0:00:00.417) 0:08:45.039 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:17:56 -0400 (0:00:00.524) 0:08:45.563 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', '_kernel_device': '/dev/dm-0', 
'_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:17:57 -0400 (0:00:00.291) 0:08:45.855 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:17:57 -0400 (0:00:00.617) 0:08:46.472 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:17:58 -0400 (0:00:00.300) 0:08:46.772 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:17:58 -0400 (0:00:00.295) 0:08:47.068 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:17:58 -0400 (0:00:00.211) 0:08:47.279 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:17:58 -0400 (0:00:00.202) 0:08:47.482 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:17:59 -0400 (0:00:00.505) 0:08:47.988 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:17:59 -0400 (0:00:00.375) 0:08:48.364 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:18:00 -0400 (0:00:00.461) 0:08:48.825 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:18:00 -0400 (0:00:00.243) 0:08:49.069 ********** skipping: [managed-node16] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:18:00 -0400 (0:00:00.311) 0:08:49.380 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:18:01 -0400 (0:00:00.366) 0:08:49.747 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:18:01 -0400 (0:00:00.203) 0:08:49.951 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:18:01 -0400 (0:00:00.287) 0:08:50.239 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:18:01 -0400 (0:00:00.395) 0:08:50.635 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:18:02 -0400 (0:00:00.226) 0:08:50.861 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:18:02 -0400 (0:00:00.307) 0:08:51.169 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:18:02 -0400 (0:00:00.431) 0:08:51.600 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:18:03 -0400 (0:00:00.352) 0:08:51.953 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:18:04 -0400 (0:00:01.438) 0:08:53.391 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:18:05 -0400 (0:00:00.728) 0:08:54.120 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:18:05 -0400 (0:00:00.283) 0:08:54.404 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:18:06 -0400 (0:00:00.341) 0:08:54.745 ********** ok: [managed-node16] => { "changed": false } 
MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:18:06 -0400 (0:00:00.238) 0:08:54.984 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:18:06 -0400 (0:00:00.319) 0:08:55.304 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:18:06 -0400 (0:00:00.369) 0:08:55.674 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:18:07 -0400 (0:00:00.406) 0:08:56.080 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:18:07 -0400 (0:00:00.357) 0:08:56.437 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:18:08 -0400 (0:00:00.296) 0:08:56.734 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:18:08 -0400 (0:00:00.232) 0:08:56.966 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:18:08 -0400 (0:00:00.261) 0:08:57.228 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false 
} TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:18:09 -0400 (0:00:00.659) 0:08:57.888 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:18:09 -0400 (0:00:00.345) 0:08:58.233 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:18:09 -0400 (0:00:00.324) 0:08:58.558 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:18:10 -0400 (0:00:00.342) 0:08:58.901 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:18:10 -0400 (0:00:00.327) 0:08:59.229 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:18:10 -0400 (0:00:00.218) 0:08:59.447 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:18:11 -0400 (0:00:00.386) 0:08:59.833 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:18:11 -0400 (0:00:00.369) 0:09:00.203 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471442.884311, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471442.884311, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 217134, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", 
"mtime": 1776471442.884311, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:18:13 -0400 (0:00:01.528) 0:09:01.731 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:18:13 -0400 (0:00:00.336) 0:09:02.068 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:18:13 -0400 (0:00:00.406) 0:09:02.475 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:18:14 -0400 (0:00:00.328) 0:09:02.803 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:18:14 -0400 (0:00:00.327) 0:09:03.130 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:18:14 -0400 (0:00:00.385) 0:09:03.515 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:18:15 -0400 (0:00:00.273) 0:09:03.789 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471443.0403113, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471443.0403113, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 217045, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471443.0403113, "nlink": 1, "path": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": 
false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:18:16 -0400 (0:00:01.291) 0:09:05.080 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:18:20 -0400 (0:00:04.008) 0:09:09.089 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010311", "end": "2026-04-17 20:18:21.521292", "rc": 0, "start": "2026-04-17 20:18:21.510981" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 2f0698e3-f8a3-4aa6-9d24-bac03448f798 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 937728 Threads: 2 Salt: 09 ed 90 86 6d cb 0c b0 d7 c9 47 0d 88 d8 29 fd 8e aa 0a c1 62 d3 07 8a b5 c5 e9 7c ab cd 4f 7c AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 119156 Salt: 1f b1 bd ba 93 1d b0 8a 9e 37 b0 cb af b2 36 86 58 29 ea e2 ef 4a 3f f0 4d 6d 00 09 3d e8 70 29 Digest: fd 87 24 ab 09 cd 4d 02 3d 06 06 4b e4 da 65 17 b3 2c e4 8a 08 71 cd a3 61 c6 a7 e2 af 25 2e cf TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:18:21 -0400 (0:00:01.386) 0:09:10.475 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:18:22 -0400 (0:00:00.262) 0:09:10.738 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:18:22 -0400 (0:00:00.318) 0:09:11.056 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:18:22 -0400 (0:00:00.234) 0:09:11.290 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:18:22 -0400 (0:00:00.282) 0:09:11.573 ********** 
skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:18:23 -0400 (0:00:00.265) 0:09:11.838 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:18:23 -0400 (0:00:00.277) 0:09:12.116 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:18:23 -0400 (0:00:00.300) 0:09:12.416 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:18:24 -0400 (0:00:00.376) 0:09:12.793 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:18:24 -0400 (0:00:00.248) 0:09:13.041 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:18:24 -0400 (0:00:00.290) 0:09:13.331 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:18:25 -0400 (0:00:00.408) 0:09:13.740 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:18:25 -0400 (0:00:00.572) 0:09:14.312 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:18:25 -0400 
(0:00:00.237) 0:09:14.549 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:18:26 -0400 (0:00:00.374) 0:09:14.923 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:18:26 -0400 (0:00:00.269) 0:09:15.193 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:18:26 -0400 (0:00:00.225) 0:09:15.418 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:18:26 -0400 (0:00:00.272) 0:09:15.691 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:18:27 -0400 (0:00:00.206) 0:09:15.897 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:18:27 -0400 (0:00:00.258) 0:09:16.156 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:18:27 -0400 (0:00:00.234) 0:09:16.390 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:18:27 -0400 (0:00:00.182) 0:09:16.572 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:18:28 -0400 (0:00:00.297) 0:09:16.870 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] 
************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:18:28 -0400 (0:00:00.186) 0:09:17.057 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:18:28 -0400 (0:00:00.215) 0:09:17.273 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:18:28 -0400 (0:00:00.178) 0:09:17.451 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:18:29 -0400 (0:00:00.381) 0:09:17.833 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:18:29 -0400 (0:00:00.285) 0:09:18.118 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:18:29 -0400 (0:00:00.305) 0:09:18.424 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:18:30 -0400 (0:00:00.323) 0:09:18.748 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:18:30 -0400 (0:00:00.177) 0:09:18.926 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:18:30 -0400 (0:00:00.266) 0:09:19.192 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:18:30 -0400 (0:00:00.261) 0:09:19.454 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default 
minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:18:30 -0400 (0:00:00.261) 0:09:19.716 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:18:31 -0400 (0:00:00.186) 0:09:19.902 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:18:31 -0400 (0:00:00.233) 0:09:20.136 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:18:31 -0400 (0:00:00.179) 0:09:20.315 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:18:31 -0400 (0:00:00.166) 0:09:20.482 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:18:31 -0400 (0:00:00.161) 0:09:20.643 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:18:32 -0400 (0:00:00.190) 0:09:20.834 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:18:32 -0400 (0:00:00.204) 0:09:21.038 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:18:32 -0400 (0:00:00.195) 0:09:21.234 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:18:32 -0400 (0:00:00.250) 0:09:21.484 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:18:33 -0400 (0:00:00.292) 0:09:21.777 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:18:33 -0400 (0:00:00.329) 0:09:22.106 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:18:33 -0400 (0:00:00.281) 0:09:22.387 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:18:33 -0400 (0:00:00.261) 0:09:22.649 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:18:34 -0400 (0:00:00.252) 0:09:22.902 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:18:34 -0400 (0:00:00.188) 0:09:23.091 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:18:34 -0400 (0:00:00.195) 0:09:23.286 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:18:34 -0400 (0:00:00.165) 0:09:23.451 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:18:34 -0400 (0:00:00.161) 0:09:23.613 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:18:35 -0400 (0:00:00.217) 0:09:23.830 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:18:35 -0400 (0:00:00.295) 0:09:24.125 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:18:35 -0400 (0:00:00.219) 0:09:24.345 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:18:35 -0400 (0:00:00.189) 0:09:24.534 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:18:36 -0400 (0:00:00.285) 0:09:24.820 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:18:36 -0400 (0:00:00.274) 0:09:25.095 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:18:36 -0400 (0:00:00.286) 0:09:25.381 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:18:36 -0400 (0:00:00.194) 0:09:25.575 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:18:37 -0400 (0:00:00.181) 0:09:25.757 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:234 
Friday 17 April 2026 20:18:38 -0400 (0:00:01.575) 0:09:27.332 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:18:39 -0400 (0:00:00.523) 0:09:27.856 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:18:39 -0400 (0:00:00.219) 0:09:28.076 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:18:39 -0400 (0:00:00.167) 0:09:28.243 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:18:39 -0400 (0:00:00.143) 0:09:28.386 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:18:39 -0400 (0:00:00.187) 0:09:28.573 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:18:41 -0400 (0:00:01.896) 0:09:30.470 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:18:41 -0400 (0:00:00.137) 0:09:30.608 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:18:43 -0400 (0:00:01.547) 0:09:32.155 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:18:43 -0400 (0:00:00.478) 0:09:32.633 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:18:44 -0400 (0:00:00.212) 0:09:32.846 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:18:44 -0400 (0:00:00.193) 0:09:33.040 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:18:44 -0400 (0:00:00.254) 0:09:33.294 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:18:44 -0400 (0:00:00.138) 0:09:33.432 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:18:45 -0400 (0:00:00.650) 0:09:34.083 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:18:45 -0400 (0:00:00.260) 0:09:34.344 
********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:18:45 -0400 (0:00:00.287) 0:09:34.631 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:18:50 -0400 (0:00:04.180) 0:09:38.812 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:18:50 -0400 (0:00:00.299) 0:09:39.111 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:18:50 -0400 (0:00:00.259) 0:09:39.371 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:18:55 -0400 (0:00:04.782) 0:09:44.153 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:18:55 -0400 (0:00:00.279) 0:09:44.433 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:18:55 -0400 (0:00:00.177) 0:09:44.610 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:18:56 -0400 (0:00:00.164) 0:09:44.774 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:18:56 -0400 (0:00:00.153) 0:09:44.927 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] 
******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:19:00 -0400 (0:00:04.343) 0:09:49.271 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": 
"initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { 
"name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service": { "name": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service": { "name": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": 
"tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:19:03 -0400 (0:00:03.344) 0:09:52.615 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d97188206\x2d9248\x2d4be9\x2d89ce\x2dbf705b443643.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "name": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket cryptsetup-pre.target dev-sda.device system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-97188206-9248-4be9-89ce-bf705b443643", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-97188206-9248-4be9-89ce-bf705b443643 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-97188206-9248-4be9-89ce-bf705b443643 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "NeedDaemonReload": "yes", 
"Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:17:35 EDT", "StateChangeTimestampMonotonic": "2113180194", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...d9248\x2d4be9\x2d89ce\x2dbf705b443643.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "name": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module 
cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", 
"PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:19:07 -0400 (0:00:04.049) 0:09:56.665 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:19:13 -0400 (0:00:05.114) 0:10:01.780 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:19:13 -0400 (0:00:00.262) 0:10:02.042 ********** changed: [managed-node16] => 
(item=systemd-cryptsetup@luks\x2d97188206\x2d9248\x2d4be9\x2d89ce\x2dbf705b443643.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "name": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d97188206\\x2d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...d9248\x2d4be9\x2d89ce\x2dbf705b443643.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "name": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d9248\\x2d4be9\\x2d89ce\\x2dbf705b443643.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:19:16 -0400 (0:00:03.235) 0:10:05.277 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:19:16 -0400 (0:00:00.283) 0:10:05.561 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Friday 17 April 2026 20:19:17 -0400 (0:00:00.350) 0:10:05.912 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:19:17 -0400 (0:00:00.228) 0:10:06.140 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471518.330538, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471518.330538, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776471518.330538, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2050024692", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:19:18 -0400 (0:00:01.438) 0:10:07.579 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:258 Friday 17 April 2026 20:19:19 -0400 (0:00:00.247) 0:10:07.827 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:19:19 -0400 (0:00:00.733) 0:10:08.561 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:19:20 -0400 (0:00:00.262) 0:10:08.823 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:19:20 -0400 (0:00:00.246) 0:10:09.070 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:19:22 -0400 (0:00:01.704) 0:10:10.774 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for 
managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:19:22 -0400 (0:00:00.233) 0:10:11.007 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:19:24 -0400 (0:00:02.170) 0:10:13.178 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:19:25 -0400 (0:00:00.550) 0:10:13.728 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:19:25 -0400 (0:00:00.304) 0:10:14.032 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:19:25 -0400 (0:00:00.233) 0:10:14.265 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:19:25 -0400 (0:00:00.177) 0:10:14.443 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:19:25 -0400 (0:00:00.172) 0:10:14.615 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:19:26 -0400 (0:00:00.392) 0:10:15.008 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:19:26 -0400 (0:00:00.192) 0:10:15.200 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:19:26 -0400 (0:00:00.228) 0:10:15.429 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:19:30 -0400 (0:00:04.229) 0:10:19.659 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:19:31 -0400 (0:00:00.129) 0:10:19.788 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:19:31 -0400 (0:00:00.155) 0:10:19.943 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:19:36 -0400 (0:00:05.055) 0:10:24.999 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:19:36 -0400 (0:00:00.350) 0:10:25.349 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support 
packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:19:36 -0400 (0:00:00.136) 0:10:25.486 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:19:37 -0400 (0:00:00.264) 0:10:25.750 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:19:37 -0400 (0:00:00.181) 0:10:25.932 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:19:41 -0400 (0:00:03.950) 0:10:29.882 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": 
"mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, 
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service": { "name": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service": { "name": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:19:43 -0400 (0:00:02.697) 0:10:32.579 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d2f0698e3\x2df8a3\x2d4aa6\x2d9d24\x2dbac03448f798.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "name": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-sda1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", 
"CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:19:07 EDT", "StateChangeTimestampMonotonic": "2205789071", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...df8a3\x2d4aa6\x2d9d24\x2dbac03448f798.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "name": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:19:47 -0400 (0:00:03.284) 0:10:35.864 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", 
"fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:19:52 -0400 (0:00:05.536) 0:10:41.400 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:19:52 -0400 (0:00:00.263) 0:10:41.663 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471452.9903414, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "93196dc36abfffd1808dc50b36aea50accf28352", "ctime": 1776471452.9873414, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": 
false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471452.9873414, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:19:54 -0400 (0:00:01.550) 0:10:43.214 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:19:56 -0400 (0:00:01.785) 0:10:45.000 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d2f0698e3\x2df8a3\x2d4aa6\x2d9d24\x2dbac03448f798.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "name": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", 
"IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": 
"[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:19:07 EDT", "StateChangeTimestampMonotonic": "2205789071", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...df8a3\x2d4aa6\x2d9d24\x2dbac03448f798.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "name": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:20:00 -0400 (0:00:03.861) 0:10:48.861 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:20:00 -0400 (0:00:00.208) 0:10:49.070 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:20:00 -0400 (0:00:00.190) 0:10:49.261 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:20:00 -0400 (0:00:00.257) 0:10:49.519 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:20:02 -0400 (0:00:01.459) 0:10:50.978 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] 
*********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:20:04 -0400 (0:00:01.999) 0:10:52.978 ********** changed: [managed-node16] => (item={'src': 'UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:20:05 -0400 (0:00:01.657) 0:10:54.635 ********** skipping: [managed-node16] => (item={'src': 'UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:20:06 -0400 (0:00:00.292) 0:10:54.927 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:20:08 -0400 (0:00:02.078) 0:10:57.006 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471467.0523837, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f7e6c5cf868101b31eed70a034e355c8cf8bdacb", "ctime": 1776471459.5343611, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 329252997, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471459.5343611, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "1465340182", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:20:09 -0400 (0:00:01.450) 0:10:58.456 ********** changed: [managed-node16] => 
(item={'backing_device': '/dev/sda1', 'name': 'luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:20:11 -0400 (0:00:01.473) 0:10:59.930 ********** ok: [managed-node16] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:274 Friday 17 April 2026 20:20:13 -0400 (0:00:01.814) 0:11:01.744 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:20:13 -0400 (0:00:00.543) 0:11:02.287 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:20:13 -0400 (0:00:00.186) 0:11:02.474 ********** skipping: [managed-node16] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:20:13 -0400 (0:00:00.212) 0:11:02.687 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:20:15 -0400 (0:00:01.464) 0:11:04.151 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002516", "end": "2026-04-17 20:20:16.801331", "rc": 0, "start": "2026-04-17 20:20:16.798815" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:20:17 -0400 (0:00:01.709) 0:11:05.860 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.003562", "end": "2026-04-17 20:20:19.526336", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:20:18.522774" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:20:19 -0400 (0:00:02.627) 0:11:08.488 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:20:20 -0400 (0:00:00.499) 0:11:08.988 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:20:20 -0400 (0:00:00.157) 0:11:09.146 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:20:20 -0400 (0:00:00.198) 0:11:09.344 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:20:20 -0400 (0:00:00.333) 0:11:09.678 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 included:
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:20:21 -0400 (0:00:00.532) 0:11:10.211 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:20:21 -0400 (0:00:00.274) 0:11:10.486 ********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:20:21 -0400 (0:00:00.217) 0:11:10.703 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:20:22 -0400 (0:00:00.349) 0:11:11.052 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:20:22 -0400 (0:00:00.299) 0:11:11.351 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:20:22 -0400 (0:00:00.322) 0:11:11.673 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:20:23 -0400 (0:00:00.285) 0:11:11.959 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:20:23 -0400 (0:00:00.234) 0:11:12.193 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:20:23 -0400 (0:00:00.229) 0:11:12.423 ********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:20:23 
-0400 (0:00:00.249) 0:11:12.673 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.9.63 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:20:25 -0400 (0:00:01.712) 0:11:14.385 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:20:25 -0400 (0:00:00.328) 0:11:14.714 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:20:26 -0400 (0:00:00.400) 0:11:15.115 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:20:26 -0400 (0:00:00.322) 0:11:15.438 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:20:26 -0400 (0:00:00.260) 0:11:15.699 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:20:27 -0400 (0:00:00.214) 0:11:15.913 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:20:27 -0400 (0:00:00.230) 0:11:16.143 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:20:27 -0400 (0:00:00.246) 0:11:16.389 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:20:27 -0400 (0:00:00.260) 0:11:16.649 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:20:28 -0400 (0:00:00.234) 0:11:16.884 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:20:28 -0400 (0:00:00.299) 0:11:17.183 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:20:28 -0400 (0:00:00.232) 0:11:17.415 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:20:28 -0400 (0:00:00.161) 0:11:17.577 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:20:29 -0400 (0:00:00.196) 0:11:17.774 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:20:29 -0400 (0:00:00.441) 0:11:18.215 ********** skipping: [managed-node16] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": 
"/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:20:29 -0400 (0:00:00.288) 0:11:18.504 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:20:30 -0400 (0:00:00.455) 0:11:18.960 ********** skipping: [managed-node16] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:20:30 -0400 (0:00:00.194) 0:11:19.154 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:20:30 -0400 (0:00:00.496) 0:11:19.651 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:20:31 -0400 (0:00:00.272) 0:11:19.924 ********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:20:31 -0400 (0:00:00.193) 0:11:20.118 ********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:20:31 -0400 (0:00:00.204) 0:11:20.323 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:20:31 -0400 (0:00:00.246) 0:11:20.569 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:20:32 -0400 (0:00:00.532) 0:11:21.102 ********** skipping: [managed-node16] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:20:32 -0400 (0:00:00.459) 0:11:21.561 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:20:33 -0400 (0:00:00.641) 0:11:22.203 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:20:33 -0400 (0:00:00.193) 0:11:22.397 ********** skipping: [managed-node16] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:20:34 
-0400 (0:00:00.945) 0:11:23.342 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:20:34 -0400 (0:00:00.247) 0:11:23.590 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:20:35 -0400 (0:00:00.282) 0:11:23.872 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:20:35 -0400 (0:00:00.273) 0:11:24.146 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:20:35 -0400 (0:00:00.337) 0:11:24.483 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:20:35 -0400 (0:00:00.216) 0:11:24.700 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:20:36 -0400 (0:00:00.185) 0:11:24.885 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:20:36 -0400 (0:00:00.449) 0:11:25.334 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:20:36 -0400 (0:00:00.264) 0:11:25.599 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:20:38 -0400 (0:00:01.276) 0:11:26.875 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:20:38 -0400 (0:00:00.225) 0:11:27.100 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:20:38 -0400 (0:00:00.289) 0:11:27.390 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:20:39 -0400 (0:00:00.376) 0:11:27.766 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:20:39 -0400 (0:00:00.336) 0:11:28.103 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:20:39 -0400 (0:00:00.135) 0:11:28.238 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:20:39 -0400 (0:00:00.162) 0:11:28.401 ********** 
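
A note on the mount verification above: the "Verify the current mount state by device" task reports only "All assertions passed". A check of this kind typically filters the gathered mount facts by device path and asserts the expected mount point. The sketch below is an assumed illustration built from the values recorded earlier in this log (/dev/sda1 and /opt/test1); it is not the literal task from test-verify-volume-mount.yml.

    # Illustrative sketch only; assumes mount facts have already been gathered.
    - name: Verify the current mount state by device (sketch)
      vars:
        storage_test_device_path: /dev/sda1                  # from "Get expected mount device based on device type"
        storage_test_mount_expected_mount_point: /opt/test1  # from "Set some facts"
      ansible.builtin.assert:
        that:
          # at least one gathered mount entry maps the device to the expected mount point
          - ansible_facts.mounts
            | selectattr('device', 'equalto', storage_test_device_path)
            | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point)
            | list | length > 0
        msg: >-
          Expected {{ storage_test_device_path }} to be mounted at
          {{ storage_test_mount_expected_mount_point }}
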
skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:20:39 -0400 (0:00:00.221) 0:11:28.623 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:20:40 -0400 (0:00:00.265) 0:11:28.889 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:20:40 -0400 (0:00:00.290) 0:11:29.179 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:20:40 -0400 (0:00:00.243) 0:11:29.422 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:20:41 -0400 (0:00:00.432) 0:11:29.855 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:20:41 -0400 (0:00:00.559) 0:11:30.415 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:20:41 -0400 (0:00:00.252) 0:11:30.667 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:20:42 -0400 (0:00:00.223) 0:11:30.891 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:20:42 -0400 (0:00:00.275) 0:11:31.166 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:20:42 -0400 (0:00:00.380) 0:11:31.546 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:20:43 -0400 (0:00:00.295) 0:11:31.842 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:20:43 -0400 (0:00:00.402) 0:11:32.244 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:20:43 -0400 (0:00:00.258) 0:11:32.503 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471592.4367623, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471592.4367623, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 235054, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471592.4367623, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:20:45 -0400 (0:00:01.440) 0:11:33.943 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:20:45 -0400 (0:00:00.229) 0:11:34.173 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:20:45 -0400 (0:00:00.254) 0:11:34.428 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:20:45 -0400 (0:00:00.256) 0:11:34.684 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:20:46 -0400 (0:00:00.246) 0:11:34.930 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:20:46 -0400 (0:00:00.276) 0:11:35.207 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:20:46 -0400 (0:00:00.281) 0:11:35.489 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:20:47 -0400 (0:00:00.253) 0:11:35.743 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:20:51 -0400 (0:00:04.406) 0:11:40.149 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:20:51 -0400 (0:00:00.352) 0:11:40.502 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:20:52 -0400 (0:00:00.239) 0:11:40.741 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:20:52 -0400 (0:00:00.233) 0:11:40.975 
********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:20:52 -0400 (0:00:00.161) 0:11:41.136 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:20:52 -0400 (0:00:00.188) 0:11:41.325 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:20:52 -0400 (0:00:00.262) 0:11:41.588 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:20:53 -0400 (0:00:00.144) 0:11:41.732 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:20:53 -0400 (0:00:00.293) 0:11:42.025 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:20:53 -0400 (0:00:00.382) 0:11:42.408 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:20:53 -0400 (0:00:00.300) 0:11:42.709 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:20:54 -0400 (0:00:00.224) 0:11:42.933 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:20:54 -0400 (0:00:00.303) 0:11:43.236 ********** skipping: [managed-node16] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:20:54 -0400 (0:00:00.304) 0:11:43.541 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:20:55 -0400 (0:00:00.379) 0:11:43.921 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:20:55 -0400 (0:00:00.323) 0:11:44.244 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:20:55 -0400 (0:00:00.272) 0:11:44.516 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:20:55 -0400 (0:00:00.183) 0:11:44.700 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:20:56 -0400 (0:00:00.255) 0:11:44.955 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:20:56 -0400 (0:00:00.227) 0:11:45.183 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:20:56 -0400 (0:00:00.380) 0:11:45.564 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:20:57 -0400 (0:00:00.199) 0:11:45.763 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID 
metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:20:57 -0400 (0:00:00.211) 0:11:45.975 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:20:57 -0400 (0:00:00.233) 0:11:46.208 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:20:57 -0400 (0:00:00.189) 0:11:46.398 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:20:57 -0400 (0:00:00.239) 0:11:46.637 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:20:58 -0400 (0:00:00.229) 0:11:46.867 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:20:58 -0400 (0:00:00.339) 0:11:47.206 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:20:58 -0400 (0:00:00.204) 0:11:47.410 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:20:58 -0400 (0:00:00.266) 0:11:47.677 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:20:59 -0400 (0:00:00.266) 0:11:47.943 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:20:59 -0400 (0:00:00.243) 0:11:48.186 ********** skipping: 
[managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:20:59 -0400 (0:00:00.240) 0:11:48.427 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:20:59 -0400 (0:00:00.299) 0:11:48.726 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:21:00 -0400 (0:00:00.277) 0:11:49.003 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:21:00 -0400 (0:00:00.205) 0:11:49.209 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:21:00 -0400 (0:00:00.268) 0:11:49.477 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:21:01 -0400 (0:00:00.303) 0:11:49.780 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:21:01 -0400 (0:00:00.286) 0:11:50.066 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:21:01 -0400 (0:00:00.180) 0:11:50.247 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:21:01 -0400 (0:00:00.201) 0:11:50.449 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 
2026 20:21:01 -0400 (0:00:00.195) 0:11:50.644 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:21:02 -0400 (0:00:00.189) 0:11:50.834 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:21:02 -0400 (0:00:00.233) 0:11:51.068 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:21:02 -0400 (0:00:00.226) 0:11:51.294 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:21:02 -0400 (0:00:00.203) 0:11:51.497 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:21:03 -0400 (0:00:00.245) 0:11:51.743 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:21:03 -0400 (0:00:00.206) 0:11:51.949 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:21:03 -0400 (0:00:00.161) 0:11:52.111 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:21:03 -0400 (0:00:00.244) 0:11:52.356 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:21:03 -0400 (0:00:00.284) 0:11:52.640 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] 
******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:21:04 -0400 (0:00:00.267) 0:11:52.907 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:21:04 -0400 (0:00:00.295) 0:11:53.203 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:21:04 -0400 (0:00:00.231) 0:11:53.435 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:21:04 -0400 (0:00:00.150) 0:11:53.586 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:21:05 -0400 (0:00:00.160) 0:11:53.747 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:21:05 -0400 (0:00:00.221) 0:11:53.968 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:21:05 -0400 (0:00:00.216) 0:11:54.185 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:21:05 -0400 (0:00:00.170) 0:11:54.355 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:21:05 -0400 (0:00:00.126) 0:11:54.482 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:21:05 -0400 (0:00:00.157) 0:11:54.640 ********** ok: [managed-node16] => { 
"ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:21:06 -0400 (0:00:00.113) 0:11:54.753 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Friday 17 April 2026 20:21:07 -0400 (0:00:01.312) 0:11:56.065 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:21:07 -0400 (0:00:00.441) 0:11:56.507 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:21:08 -0400 (0:00:00.221) 0:11:56.728 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:21:08 -0400 (0:00:00.289) 0:11:57.017 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:21:08 -0400 (0:00:00.258) 0:11:57.276 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:21:08 -0400 (0:00:00.211) 0:11:57.488 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:21:10 -0400 (0:00:01.733) 0:11:59.221 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 
Friday 17 April 2026 20:21:10 -0400 (0:00:00.180) 0:11:59.402 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:21:12 -0400 (0:00:01.815) 0:12:01.217 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:21:12 -0400 (0:00:00.430) 0:12:01.648 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:21:13 -0400 (0:00:00.151) 0:12:01.799 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:21:13 -0400 (0:00:00.181) 0:12:01.980 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:21:13 -0400 (0:00:00.122) 0:12:02.103 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 
2026 20:21:13 -0400 (0:00:00.111) 0:12:02.214 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:21:13 -0400 (0:00:00.380) 0:12:02.595 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:21:14 -0400 (0:00:00.138) 0:12:02.734 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:21:14 -0400 (0:00:00.096) 0:12:02.830 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:21:18 -0400 (0:00:04.170) 0:12:07.001 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:21:18 -0400 (0:00:00.188) 0:12:07.190 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:21:18 -0400 (0:00:00.258) 0:12:07.449 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:21:23 -0400 (0:00:05.238) 0:12:12.687 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:21:24 -0400 (0:00:00.442) 0:12:13.129 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:21:25 -0400 (0:00:00.871) 
0:12:14.001 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:21:25 -0400 (0:00:00.218) 0:12:14.219 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:21:25 -0400 (0:00:00.176) 0:12:14.395 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:21:29 -0400 (0:00:04.291) 0:12:18.687 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", 
"status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": 
{ "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service": { "name": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service": { "name": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:21:32 -0400 (0:00:02.538) 0:12:21.226 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d2f0698e3\x2df8a3\x2d4aa6\x2d9d24\x2dbac03448f798.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "name": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": 
"[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:19:07 EDT", "StateChangeTimestampMonotonic": "2205789071", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...df8a3\x2d4aa6\x2d9d24\x2dbac03448f798.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "name": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", 
"AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service not found.\"", "LoadState": 
"not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:21:35 -0400 (0:00:03.323) 0:12:24.549 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:21:40 -0400 (0:00:05.044) 0:12:29.593 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:21:41 -0400 (0:00:00.336) 0:12:29.930 ********** changed: [managed-node16] => 
(item=systemd-cryptsetup@luks\x2d2f0698e3\x2df8a3\x2d4aa6\x2d9d24\x2dbac03448f798.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "name": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d2f0698e3\\x2df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...df8a3\x2d4aa6\x2d9d24\x2dbac03448f798.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "name": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...df8a3\\x2d4aa6\\x2d9d24\\x2dbac03448f798.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:21:44 -0400 (0:00:03.735) 0:12:33.666 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:21:45 -0400 (0:00:00.232) 0:12:33.898 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Friday 17 April 2026 20:21:45 -0400 (0:00:00.349) 0:12:34.247 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:21:45 -0400 (0:00:00.237) 0:12:34.485 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471667.0839884, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471667.0839884, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776471667.0839884, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1877156620", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:21:47 -0400 (0:00:01.458) 0:12:35.944 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:306 Friday 17 April 2026 20:21:47 -0400 (0:00:00.231) 0:12:36.175 ********** ok: [managed-node16] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testa62n2quhlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:313 Friday 17 April 2026 20:21:50 -0400 (0:00:03.275) 0:12:39.451 ********** ok: [managed-node16] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testa62n2quhlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1776471711.108652-203947-236754163458575/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:320 Friday 17 April 2026 20:21:54 -0400 (0:00:03.574) 0:12:43.025 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:21:54 -0400 (0:00:00.284) 
0:12:43.309 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:21:54 -0400 (0:00:00.204) 0:12:43.513 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:21:55 -0400 (0:00:00.287) 0:12:43.801 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:21:57 -0400 (0:00:02.061) 0:12:45.863 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:21:57 -0400 (0:00:00.303) 0:12:46.166 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:21:59 -0400 (0:00:02.175) 0:12:48.342 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:21:59 -0400 (0:00:00.360) 0:12:48.702 ********** skipping: [managed-node16] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:22:00 -0400 (0:00:00.171) 0:12:48.873 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:22:00 -0400 (0:00:00.254) 0:12:49.127 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:22:00 -0400 (0:00:00.400) 0:12:49.527 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:22:01 -0400 (0:00:00.211) 0:12:49.739 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:22:01 -0400 (0:00:00.534) 0:12:50.273 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:22:01 -0400 (0:00:00.158) 0:12:50.432 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:22:01 -0400 (0:00:00.162) 0:12:50.595 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:22:05 -0400 (0:00:04.058) 0:12:54.653 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testa62n2quhlukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:22:06 -0400 (0:00:00.227) 0:12:54.881 ********** ok: [managed-node16] => { 
"storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:22:06 -0400 (0:00:00.167) 0:12:55.049 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:22:11 -0400 (0:00:05.134) 0:13:00.184 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:22:11 -0400 (0:00:00.325) 0:13:00.509 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:22:11 -0400 (0:00:00.099) 0:13:00.608 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:22:12 -0400 (0:00:00.212) 0:13:00.821 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:22:12 -0400 (0:00:00.195) 0:13:01.017 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:22:16 -0400 (0:00:04.547) 0:13:05.565 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": 
"sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": 
"systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:22:19 -0400 (0:00:03.065) 0:13:08.631 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:22:20 -0400 (0:00:00.360) 0:13:08.992 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "password": "/tmp/storage_testa62n2quhlukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testa62n2quhlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:22:34 -0400 (0:00:14.018) 0:13:23.010 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is 
present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:22:34 -0400 (0:00:00.259) 0:13:23.269 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471605.5818021, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a26f7aba490f0771fa8fe75a3c9083b58abb2a53", "ctime": 1776471605.578802, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471605.578802, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:22:36 -0400 (0:00:01.650) 0:13:24.919 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:22:37 -0400 (0:00:01.725) 0:13:26.645 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:22:38 -0400 (0:00:00.378) 0:13:27.024 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "password": "/tmp/storage_testa62n2quhlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testa62n2quhlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:22:38 -0400 (0:00:00.200) 0:13:27.224 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testa62n2quhlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] 
*** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:22:38 -0400 (0:00:00.145) 0:13:27.370 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:22:38 -0400 (0:00:00.132) 0:13:27.503 ********** changed: [managed-node16] => (item={'src': 'UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=28b8e8b6-f62f-4a2e-8fe5-3a26b0fa8464" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:22:39 -0400 (0:00:01.157) 0:13:28.660 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:22:41 -0400 (0:00:01.866) 0:13:30.527 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:22:43 -0400 (0:00:01.577) 0:13:32.104 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:22:43 -0400 (0:00:00.245) 0:13:32.350 ********** ok: 
[managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:22:44 -0400 (0:00:01.368) 0:13:33.719 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471618.5238414, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471610.9388185, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 501219476, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776471610.9378185, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "715759064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:22:46 -0400 (0:00:01.386) 0:13:35.106 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda1', 'name': 'luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', 'password': '/tmp/storage_testa62n2quhlukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "password": "/tmp/storage_testa62n2quhlukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:22:47 -0400 (0:00:01.496) 0:13:36.603 ********** ok: [managed-node16] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:336 Friday 17 April 2026 20:22:49 -0400 (0:00:01.795) 0:13:38.398 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:22:49 -0400 (0:00:00.215) 0:13:38.614 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testa62n2quhlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:22:50 -0400 (0:00:00.140) 0:13:38.754 ********** skipping: [managed-node16] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:22:50 -0400 (0:00:00.210) 0:13:38.965 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "size": "4G", "type": "crypt", "uuid": "528331a2-339a-4c85-a424-fbb03ed0dff9" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "7a5c617a-1200-4191-9727-a5bffa31fbb7" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", 
"type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:22:52 -0400 (0:00:01.809) 0:13:40.775 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002420", "end": "2026-04-17 20:22:53.240929", "rc": 0, "start": "2026-04-17 20:22:53.238509" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:22:53 -0400 (0:00:01.450) 0:13:42.226 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002249", "end": "2026-04-17 20:22:54.574665", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:22:54.572416" } STDOUT: luks-7a5c617a-1200-4191-9727-a5bffa31fbb7 /dev/sda1 /tmp/storage_testa62n2quhlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:22:54 -0400 (0:00:01.330) 0:13:43.556 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:22:55 -0400 (0:00:00.330) 0:13:43.887 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:22:55 -0400 
(0:00:00.081) 0:13:43.969 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:22:55 -0400 (0:00:00.151) 0:13:44.120 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:22:55 -0400 (0:00:00.205) 0:13:44.325 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:22:56 -0400 (0:00:00.452) 0:13:44.778 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:22:56 -0400 (0:00:00.226) 0:13:45.005 ********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:22:56 -0400 (0:00:00.165) 0:13:45.170 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:22:56 -0400 (0:00:00.257) 0:13:45.428 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:22:56 -0400 (0:00:00.220) 0:13:45.649 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:22:57 -0400 (0:00:00.148) 0:13:45.797 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:22:57 -0400 (0:00:00.209) 0:13:46.006 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:22:57 -0400 (0:00:00.243) 0:13:46.250 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:22:57 -0400 (0:00:00.187) 0:13:46.438 ********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:22:57 -0400 (0:00:00.188) 0:13:46.627 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.9.63 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:22:59 -0400 (0:00:01.371) 0:13:47.998 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:22:59 -0400 (0:00:00.211) 0:13:48.210 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:22:59 -0400 (0:00:00.428) 0:13:48.638 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:23:00 -0400 (0:00:00.238) 0:13:48.876 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:23:00 -0400 (0:00:00.281) 0:13:49.157 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:23:00 -0400 (0:00:00.214) 0:13:49.372 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:23:00 -0400 (0:00:00.122) 0:13:49.494 ********** skipping: 
[managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:23:00 -0400 (0:00:00.094) 0:13:49.589 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:23:01 -0400 (0:00:00.181) 0:13:49.771 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:23:01 -0400 (0:00:00.177) 0:13:49.948 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:23:01 -0400 (0:00:00.129) 0:13:50.078 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:23:01 -0400 (0:00:00.228) 0:13:50.306 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:23:01 -0400 (0:00:00.213) 0:13:50.519 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:23:01 -0400 (0:00:00.161) 0:13:50.681 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:23:02 -0400 (0:00:00.273) 0:13:50.954 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testa62n2quhlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 
'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testa62n2quhlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:23:02 -0400 (0:00:00.201) 0:13:51.156 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:23:02 -0400 (0:00:00.351) 0:13:51.507 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testa62n2quhlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': 
None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testa62n2quhlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:23:03 -0400 (0:00:00.230) 0:13:51.737 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:23:03 -0400 (0:00:00.340) 0:13:52.078 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:23:03 -0400 (0:00:00.208) 0:13:52.287 ********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:23:03 -0400 (0:00:00.191) 0:13:52.478 ********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:23:03 
-0400 (0:00:00.216) 0:13:52.694 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:23:04 -0400 (0:00:00.213) 0:13:52.908 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:23:04 -0400 (0:00:00.494) 0:13:53.402 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testa62n2quhlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testa62n2quhlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:23:04 -0400 (0:00:00.262) 0:13:53.664 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:23:05 -0400 (0:00:00.567) 0:13:54.231 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:23:05 -0400 (0:00:00.246) 0:13:54.478 ********** skipping: [managed-node16] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:23:05 -0400 (0:00:00.171) 0:13:54.649 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:23:06 -0400 (0:00:00.289) 0:13:54.939 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:23:06 -0400 (0:00:00.276) 0:13:55.215 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:23:06 -0400 (0:00:00.206) 0:13:55.422 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:23:06 -0400 (0:00:00.180) 0:13:55.603 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:23:07 -0400 (0:00:00.170) 0:13:55.774 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:23:07 -0400 (0:00:00.151) 0:13:55.925 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:23:07 -0400 (0:00:00.433) 0:13:56.359 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:23:07 -0400 (0:00:00.289) 0:13:56.648 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:23:09 -0400 (0:00:01.346) 0:13:57.995 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:23:09 -0400 (0:00:00.201) 0:13:58.196 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:23:09 -0400 (0:00:00.181) 0:13:58.378 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:23:10 -0400 (0:00:00.532) 0:13:58.910 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:23:10 -0400 (0:00:00.275) 0:13:59.185 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:23:10 -0400 (0:00:00.300) 0:13:59.486 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:23:11 -0400 (0:00:00.377) 0:13:59.863 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:23:11 -0400 (0:00:00.398) 0:14:00.262 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:23:11 -0400 (0:00:00.288) 0:14:00.551 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:23:12 -0400 (0:00:00.231) 0:14:00.783 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:23:12 -0400 (0:00:00.207) 0:14:00.990 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:23:12 -0400 (0:00:00.125) 0:14:01.116 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", 
"storage_test_fstab_id_matches": [ "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:23:12 -0400 (0:00:00.582) 0:14:01.699 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:23:13 -0400 (0:00:00.193) 0:14:01.893 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:23:13 -0400 (0:00:00.251) 0:14:02.144 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:23:13 -0400 (0:00:00.186) 0:14:02.331 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:23:13 -0400 (0:00:00.268) 0:14:02.599 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:23:14 -0400 (0:00:00.319) 0:14:02.918 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:23:14 -0400 (0:00:00.385) 0:14:03.304 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:23:14 -0400 (0:00:00.411) 0:14:03.715 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471753.842251, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471753.842251, "dev": 6, "device_type": 2049, "executable": false, 
"exists": true, "gid": 6, "gr_name": "disk", "inode": 235054, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471753.842251, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:23:16 -0400 (0:00:01.509) 0:14:05.225 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:23:16 -0400 (0:00:00.146) 0:14:05.371 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:23:17 -0400 (0:00:00.893) 0:14:06.265 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:23:17 -0400 (0:00:00.174) 0:14:06.439 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:23:17 -0400 (0:00:00.184) 0:14:06.623 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:23:18 -0400 (0:00:00.296) 0:14:06.919 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:23:18 -0400 (0:00:00.257) 0:14:07.177 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471753.9992516, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471753.9992516, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 252966, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471753.9992516, "nlink": 
1, "path": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:23:19 -0400 (0:00:01.177) 0:14:08.354 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:23:23 -0400 (0:00:04.086) 0:14:12.441 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010018", "end": "2026-04-17 20:23:24.963920", "rc": 0, "start": "2026-04-17 20:23:24.953902" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 7a5c617a-1200-4191-9727-a5bffa31fbb7 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 930270 Threads: 2 Salt: 6c cd 2b e9 be 57 e4 4c a9 7b 1d 81 9f 8e 10 bd 71 1e f0 46 b3 a9 42 83 3d 71 9c 5e 6e 05 ee b8 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 64 cd af 2c 9b 07 0b d0 28 0c 02 6a 9c e5 d0 ea 35 f5 75 19 8e 13 e4 97 cb 73 5c d2 7b c5 db 3e Digest: 99 f4 be 55 ff 31 c8 ef b2 92 0f 07 4a 06 57 60 ce 1c 9c 81 0d ef 2a 25 08 c2 7f 8c 7f fb 9f b1 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:23:25 -0400 (0:00:01.453) 0:14:13.894 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:23:25 -0400 (0:00:00.313) 0:14:14.208 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:23:25 -0400 (0:00:00.306) 0:14:14.514 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:23:26 -0400 (0:00:00.220) 0:14:14.735 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:23:26 -0400 (0:00:00.239) 0:14:14.974 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:23:26 -0400 (0:00:00.235) 0:14:15.209 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:23:26 -0400 (0:00:00.219) 0:14:15.429 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:23:26 -0400 (0:00:00.225) 0:14:15.654 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-7a5c617a-1200-4191-9727-a5bffa31fbb7 /dev/sda1 /tmp/storage_testa62n2quhlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testa62n2quhlukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:23:27 -0400 (0:00:00.308) 0:14:15.963 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:23:27 -0400 (0:00:00.216) 0:14:16.180 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:23:27 -0400 (0:00:00.233) 0:14:16.413 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:23:27 -0400 (0:00:00.247) 0:14:16.661 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:23:28 -0400 (0:00:00.250) 0:14:16.912 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": 
null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:23:28 -0400 (0:00:00.181) 0:14:17.094 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:23:28 -0400 (0:00:00.207) 0:14:17.301 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:23:28 -0400 (0:00:00.196) 0:14:17.497 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:23:29 -0400 (0:00:00.300) 0:14:17.798 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:23:29 -0400 (0:00:00.267) 0:14:18.065 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:23:29 -0400 (0:00:00.267) 0:14:18.333 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:23:29 -0400 (0:00:00.330) 0:14:18.663 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:23:30 -0400 (0:00:00.240) 0:14:18.904 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:23:30 -0400 (0:00:00.233) 0:14:19.137 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:23:30 -0400 (0:00:00.215) 0:14:19.353 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:23:30 -0400 (0:00:00.170) 0:14:19.523 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:23:31 -0400 (0:00:00.242) 0:14:19.765 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:23:31 -0400 (0:00:00.208) 0:14:19.974 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:23:31 -0400 (0:00:00.217) 0:14:20.192 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:23:31 -0400 (0:00:00.173) 0:14:20.366 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:23:31 -0400 (0:00:00.262) 0:14:20.628 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:23:32 -0400 (0:00:00.255) 0:14:20.884 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:23:32 -0400 (0:00:00.215) 0:14:21.099 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:23:32 -0400 (0:00:00.239) 0:14:21.339 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:23:32 -0400 (0:00:00.236) 0:14:21.575 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:23:33 -0400 (0:00:00.244) 0:14:21.820 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:23:33 -0400 (0:00:00.329) 0:14:22.149 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:23:33 -0400 (0:00:00.290) 0:14:22.440 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:23:33 -0400 (0:00:00.278) 0:14:22.718 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:23:34 -0400 (0:00:00.214) 0:14:22.933 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:23:34 -0400 (0:00:00.185) 0:14:23.119 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:23:34 -0400 (0:00:00.256) 0:14:23.376 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:23:34 -0400 (0:00:00.229) 0:14:23.606 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:23:35 -0400 (0:00:00.266) 0:14:23.872 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected 
thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:23:35 -0400 (0:00:00.265) 0:14:24.138 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:23:35 -0400 (0:00:00.259) 0:14:24.397 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:23:35 -0400 (0:00:00.277) 0:14:24.675 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:23:36 -0400 (0:00:00.224) 0:14:24.900 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:23:36 -0400 (0:00:00.334) 0:14:25.234 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:23:36 -0400 (0:00:00.234) 0:14:25.468 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:23:36 -0400 (0:00:00.224) 0:14:25.693 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:23:37 -0400 (0:00:00.222) 0:14:25.916 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:23:37 -0400 (0:00:00.264) 0:14:26.180 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:23:37 -0400 (0:00:00.185) 0:14:26.366 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:23:37 -0400 (0:00:00.247) 0:14:26.613 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:23:38 -0400 (0:00:00.266) 0:14:26.880 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:23:38 -0400 (0:00:00.223) 0:14:27.103 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:23:38 -0400 (0:00:00.270) 0:14:27.374 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:23:38 -0400 (0:00:00.316) 0:14:27.690 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:23:39 -0400 (0:00:00.382) 0:14:28.072 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:23:39 -0400 (0:00:00.195) 0:14:28.267 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:23:39 -0400 (0:00:00.184) 0:14:28.452 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:342 Friday 17 April 2026 20:23:39 -0400 (0:00:00.187) 0:14:28.639 ********** ok: [managed-node16] => { "changed": false, 
"path": "/tmp/storage_testa62n2quhlukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:352 Friday 17 April 2026 20:23:41 -0400 (0:00:01.472) 0:14:30.112 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:23:41 -0400 (0:00:00.286) 0:14:30.398 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:23:41 -0400 (0:00:00.273) 0:14:30.672 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:23:42 -0400 (0:00:00.255) 0:14:30.928 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:23:42 -0400 (0:00:00.232) 0:14:31.160 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:23:42 -0400 (0:00:00.212) 0:14:31.373 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:23:44 -0400 (0:00:01.815) 0:14:33.189 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:23:44 -0400 (0:00:00.340) 0:14:33.529 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:23:46 -0400 (0:00:01.737) 0:14:35.267 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] 
=> (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:23:46 -0400 (0:00:00.377) 0:14:35.644 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:23:47 -0400 (0:00:00.263) 0:14:35.908 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:23:47 -0400 (0:00:00.213) 0:14:36.121 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:23:47 -0400 (0:00:00.264) 0:14:36.386 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:23:47 -0400 (0:00:00.244) 0:14:36.630 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:23:48 -0400 (0:00:00.529) 0:14:37.161 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:23:48 -0400 (0:00:00.312) 0:14:37.473 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:23:48 -0400 (0:00:00.145) 0:14:37.619 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:23:53 -0400 (0:00:04.159) 0:14:41.778 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:23:53 -0400 (0:00:00.159) 0:14:41.938 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:23:53 -0400 (0:00:00.139) 0:14:42.077 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:23:58 -0400 (0:00:05.088) 0:14:47.165 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:23:58 -0400 (0:00:00.262) 0:14:47.428 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:23:58 -0400 (0:00:00.090) 0:14:47.519 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:23:58 -0400 (0:00:00.152) 0:14:47.672 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 
20:23:59 -0400 (0:00:00.170) 0:14:47.842 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:24:03 -0400 (0:00:03.899) 0:14:51.741 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": 
"inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": 
"ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:24:05 -0400 (0:00:02.437) 0:14:54.179 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:24:05 -0400 (0:00:00.322) 0:14:54.501 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:24:11 -0400 (0:00:05.263) 0:14:59.765 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:24:11 -0400 (0:00:00.227) 0:14:59.992 ********** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:24:11 -0400 (0:00:00.306) 0:15:00.299 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:24:11 -0400 (0:00:00.212) 0:15:00.512 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:24:12 -0400 (0:00:00.323) 0:15:00.836 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:370 Friday 17 April 2026 20:24:12 -0400 (0:00:00.154) 0:15:00.990 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:24:12 -0400 (0:00:00.157) 0:15:01.148 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:24:12 -0400 (0:00:00.254) 0:15:01.402 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:24:12 -0400 (0:00:00.142) 0:15:01.545 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:24:14 -0400 (0:00:01.753) 0:15:03.299 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:24:14 -0400 (0:00:00.187) 0:15:03.487 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:24:16 -0400 (0:00:01.445) 0:15:04.933 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", 
"skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:24:16 -0400 (0:00:00.161) 0:15:05.094 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:24:16 -0400 (0:00:00.175) 0:15:05.269 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:24:16 -0400 (0:00:00.148) 0:15:05.417 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:24:16 -0400 (0:00:00.084) 0:15:05.502 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:24:16 -0400 (0:00:00.096) 0:15:05.599 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:24:17 -0400 (0:00:00.470) 0:15:06.069 ********** skipping: [managed-node16] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:24:17 -0400 (0:00:00.351) 0:15:06.421 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:24:17 -0400 (0:00:00.302) 0:15:06.724 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:24:21 -0400 (0:00:03.959) 0:15:10.683 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:24:22 -0400 (0:00:00.187) 0:15:10.870 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:24:22 -0400 (0:00:00.138) 0:15:11.009 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:24:27 -0400 (0:00:05.412) 0:15:16.422 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:24:28 -0400 (0:00:00.392) 0:15:16.814 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:24:28 -0400 (0:00:00.154) 0:15:16.968 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:24:28 -0400 (0:00:00.228) 0:15:17.197 ********** TASK 
[fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:24:28 -0400 (0:00:00.173) 0:15:17.370 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:24:32 -0400 (0:00:04.291) 0:15:21.662 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": 
"dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:24:35 -0400 (0:00:02.784) 0:15:24.447 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:24:36 -0400 (0:00:00.417) 0:15:24.864 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", 
"owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:24:47 -0400 (0:00:11.746) 0:15:36.610 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:24:48 -0400 (0:00:00.284) 0:15:36.895 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471763.1182792, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d008cf511f836752604340983d11afa696e122da", "ctime": 1776471763.1152792, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471763.1152792, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** 
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:24:49 -0400 (0:00:01.588) 0:15:38.483 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:24:51 -0400 (0:00:01.317) 0:15:39.801 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:24:51 -0400 (0:00:00.376) 0:15:40.177 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:24:51 -0400 (0:00:00.311) 0:15:40.488 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:24:52 -0400 (0:00:00.378) 0:15:40.867 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:24:52 -0400 (0:00:00.362) 0:15:41.229 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7a5c617a-1200-4191-9727-a5bffa31fbb7" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:24:54 -0400 (0:00:01.616) 0:15:42.845 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:24:55 -0400 (0:00:01.870) 0:15:44.716 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:24:57 -0400 (0:00:01.401) 0:15:46.118 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:24:57 -0400 (0:00:00.331) 0:15:46.449 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:24:59 -0400 (0:00:01.849) 0:15:48.298 ********** ok: 
[managed-node16] => { "changed": false, "stat": { "atime": 1776471774.5733137, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ecee078a11df5b8be8db739fdcccb0d5e72f9db2", "ctime": 1776471767.6352928, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 132120778, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471767.6332927, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "3791812545", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:25:01 -0400 (0:00:01.752) 0:15:50.051 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda1', 'name': 'luks-7a5c617a-1200-4191-9727-a5bffa31fbb7', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node16] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-30ec6405-b599-4e78-a6b8-ead3f359c08c', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:25:04 -0400 (0:00:03.059) 0:15:53.111 ********** ok: [managed-node16] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:388 Friday 17 April 2026 20:25:06 -0400 (0:00:02.157) 0:15:55.268 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:25:06 -0400 (0:00:00.341) 0:15:55.609 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", 
"_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:25:07 -0400 (0:00:00.225) 0:15:55.835 ********** skipping: [managed-node16] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:25:07 -0400 (0:00:00.261) 0:15:56.096 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "30ec6405-b599-4e78-a6b8-ead3f359c08c" }, "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "size": "4G", "type": "crypt", "uuid": "bfc36714-43ff-494e-b506-48e5ae1691f9" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "MXMlX8-1TPL-giWv-f9GY-miKG-0a91-sT0DvI" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { 
"fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:25:09 -0400 (0:00:01.668) 0:15:57.764 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002483", "end": "2026-04-17 20:25:10.447966", "rc": 0, "start": "2026-04-17 20:25:10.445483" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:25:10 -0400 (0:00:01.781) 0:15:59.546 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002447", "end": "2026-04-17 20:25:12.131822", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:25:12.129375" } STDOUT: luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:25:12 -0400 (0:00:01.583) 0:16:01.130 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:25:12 -0400 (0:00:00.320) 0:16:01.450 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:25:12 -0400 (0:00:00.189) 0:16:01.639 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.021987", "end": "2026-04-17 20:25:14.186871", "rc": 0, "start": "2026-04-17 20:25:14.164884" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:25:14 -0400 (0:00:01.599) 0:16:03.239 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:25:14 -0400 (0:00:00.453) 0:16:03.693 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:25:15 -0400 (0:00:00.599) 0:16:04.292 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:25:15 -0400 (0:00:00.370) 0:16:04.663 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:25:19 -0400 (0:00:03.960) 0:16:08.624 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:25:20 -0400 (0:00:00.299) 0:16:08.924 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:25:20 -0400 (0:00:00.308) 0:16:09.232 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:25:20 -0400 (0:00:00.403) 0:16:09.636 ********** ok: [managed-node16] => { 
"ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:25:21 -0400 (0:00:00.227) 0:16:09.864 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:25:21 -0400 (0:00:00.366) 0:16:10.230 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:25:21 -0400 (0:00:00.351) 0:16:10.582 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:25:22 -0400 (0:00:00.476) 0:16:11.058 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.9.63 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:25:23 -0400 (0:00:01.579) 0:16:12.637 ********** skipping: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:25:24 -0400 (0:00:00.292) 0:16:12.930 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:25:24 -0400 (0:00:00.629) 0:16:13.559 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:25:25 -0400 (0:00:00.217) 0:16:13.777 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:25:25 -0400 (0:00:00.267) 0:16:14.044 ********** 
skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:25:25 -0400 (0:00:00.317) 0:16:14.362 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:25:25 -0400 (0:00:00.249) 0:16:14.611 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:25:26 -0400 (0:00:00.235) 0:16:14.846 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:25:26 -0400 (0:00:00.286) 0:16:15.133 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:25:26 -0400 (0:00:00.285) 0:16:15.419 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:25:26 -0400 (0:00:00.283) 0:16:15.702 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:25:27 -0400 (0:00:00.255) 0:16:15.958 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:25:27 -0400 (0:00:00.314) 0:16:16.272 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:25:27 -0400 (0:00:00.308) 0:16:16.581 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:25:28 -0400 (0:00:00.560) 0:16:17.141 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node16 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 April 2026 20:25:28 -0400 (0:00:00.382) 0:16:17.523 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 April 2026 20:25:29 -0400 (0:00:00.246) 0:16:17.770 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 April 2026 20:25:29 -0400 (0:00:00.296) 0:16:18.066 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 April 2026 20:25:29 -0400 (0:00:00.317) 0:16:18.384 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 April 2026 20:25:29 -0400 (0:00:00.266) 0:16:18.651 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 April 2026 20:25:30 -0400 (0:00:00.343) 0:16:18.994 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 April 2026 20:25:31 -0400 (0:00:01.307) 0:16:20.302 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:25:31 -0400 (0:00:00.294) 0:16:20.596 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:25:32 -0400 (0:00:00.489) 0:16:21.086 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node16 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 April 2026 20:25:32 -0400 (0:00:00.367) 0:16:21.453 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 April 2026 20:25:32 -0400 (0:00:00.237) 0:16:21.691 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 April 2026 20:25:33 -0400 (0:00:00.277) 0:16:21.969 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 April 2026 20:25:33 -0400 (0:00:00.271) 0:16:22.241 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:25:33 -0400 (0:00:00.228) 0:16:22.469 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:25:34 -0400 (0:00:00.620) 0:16:23.089 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:25:34 -0400 (0:00:00.205) 0:16:23.294 ********** skipping: [managed-node16] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:25:34 -0400 (0:00:00.306) 0:16:23.601 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node16 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 April 2026 20:25:35 -0400 (0:00:00.394) 0:16:23.995 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 April 2026 20:25:35 -0400 (0:00:00.229) 0:16:24.224 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 April 2026 20:25:35 -0400 (0:00:00.312) 0:16:24.536 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 April 2026 20:25:35 -0400 (0:00:00.170) 0:16:24.707 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 April 2026 20:25:36 -0400 (0:00:00.205) 0:16:24.912 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 April 2026 20:25:36 -0400 (0:00:00.181) 0:16:25.093 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:25:36 -0400 (0:00:00.211) 0:16:25.304 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:25:36 -0400 (0:00:00.184) 0:16:25.489 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:25:37 -0400 (0:00:00.378) 0:16:25.868 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node16 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 April 2026 20:25:37 -0400 (0:00:00.387) 0:16:26.255 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 April 2026 20:25:37 -0400 (0:00:00.276) 0:16:26.532 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 April 2026 20:25:37 -0400 (0:00:00.183) 0:16:26.716 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 April 2026 20:25:38 -0400 (0:00:00.166) 0:16:26.882 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 April 2026 20:25:38 -0400 (0:00:00.295) 0:16:27.177 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 April 2026 20:25:38 -0400 (0:00:00.242) 0:16:27.420 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 April 2026 20:25:39 -0400 (0:00:00.307) 0:16:27.727 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:25:39 -0400 (0:00:00.234) 0:16:27.962 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 
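The "Check Stratis" block included above is skipped throughout this run because the pool under test is of type lvm, not stratis. For a stratis pool, the "Get stratis pool information" step would have to query stratisd; a rough sketch of what that could look like, assuming the stratis-cli "stratis report" command and its JSON layout (both assumptions here, not taken from the test file):

    # Sketch only: collect the stratisd state and check that a pool named "foo" exists.
    - name: Get stratis pool information
      ansible.builtin.command: stratis report
      register: storage_test_stratis_report   # variable name as reset later in the log
      changed_when: false

    - name: Verify that the pool was created
      ansible.builtin.assert:
        that: >-
          (storage_test_stratis_report.stdout | from_json).pools
          | selectattr('name', 'equalto', 'foo') | list | length == 1
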
Friday 17 April 2026 20:25:39 -0400 (0:00:00.543) 0:16:28.505 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:25:40 -0400 (0:00:00.308) 0:16:28.814 ********** skipping: [managed-node16] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:25:40 -0400 (0:00:00.221) 0:16:29.035 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:25:40 -0400 (0:00:00.236) 0:16:29.272 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:25:40 -0400 (0:00:00.289) 0:16:29.561 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:25:41 -0400 (0:00:00.228) 0:16:29.790 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:25:41 -0400 (0:00:00.286) 0:16:30.076 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:25:41 -0400 (0:00:00.217) 0:16:30.294 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:25:41 -0400 (0:00:00.217) 0:16:30.511 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:25:42 -0400 (0:00:00.358) 0:16:30.871 ********** ok: [managed-node16] => 
{ "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:25:42 -0400 (0:00:00.267) 0:16:31.138 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:25:43 -0400 (0:00:01.089) 0:16:32.228 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:25:43 -0400 (0:00:00.203) 0:16:32.431 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:25:43 -0400 (0:00:00.213) 0:16:32.645 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:25:44 -0400 (0:00:00.275) 0:16:32.920 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:25:44 -0400 (0:00:00.257) 0:16:33.177 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify 
mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:25:44 -0400 (0:00:00.336) 0:16:33.514 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:25:45 -0400 (0:00:00.265) 0:16:33.779 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:25:45 -0400 (0:00:00.222) 0:16:34.002 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:25:45 -0400 (0:00:00.275) 0:16:34.277 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:25:45 -0400 (0:00:00.281) 0:16:34.559 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:25:46 -0400 (0:00:00.279) 0:16:34.839 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:25:46 -0400 (0:00:00.123) 0:16:34.963 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:25:46 -0400 (0:00:00.600) 0:16:35.563 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] 
******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:25:47 -0400 (0:00:00.326) 0:16:35.889 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:25:47 -0400 (0:00:00.324) 0:16:36.214 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:25:47 -0400 (0:00:00.247) 0:16:36.461 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:25:47 -0400 (0:00:00.176) 0:16:36.637 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:25:48 -0400 (0:00:00.146) 0:16:36.784 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:25:48 -0400 (0:00:00.301) 0:16:37.085 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:25:48 -0400 (0:00:00.285) 0:16:37.371 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471887.4096534, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471887.4096534, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 268804, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471887.4096534, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:25:50 -0400 (0:00:01.680) 0:16:39.051 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:25:50 -0400 (0:00:00.327) 0:16:39.378 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:25:50 -0400 (0:00:00.208) 0:16:39.587 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:25:51 -0400 (0:00:00.200) 0:16:39.788 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:25:51 -0400 (0:00:00.342) 0:16:40.130 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:25:51 -0400 (0:00:00.355) 0:16:40.486 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:25:52 -0400 (0:00:00.246) 0:16:40.733 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471887.5466537, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471887.5466537, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 267984, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471887.5466537, "nlink": 1, "path": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:25:53 -0400 (0:00:01.578) 0:16:42.311 ********** ok: [managed-node16] => { "changed": false, "rc": 0, 
"results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:25:57 -0400 (0:00:04.153) 0:16:46.465 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009426", "end": "2026-04-17 20:25:58.855649", "rc": 0, "start": "2026-04-17 20:25:58.846223" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: a1 78 fd 06 7c 5f 93 28 ab 7b ef da a7 10 08 c8 81 1a b2 c7 MK salt: 1d 8a 9b d6 8c fd 1d a4 d2 e2 dc 83 23 70 07 6c fa 69 4f e0 6c 90 55 87 01 08 01 a6 17 83 9f 45 MK iterations: 120249 UUID: 30ec6405-b599-4e78-a6b8-ead3f359c08c Key Slot 0: ENABLED Iterations: 1927528 Salt: a1 98 f5 d5 93 5b b3 00 72 4e 88 f3 de 8c 15 31 51 f7 34 bd 42 09 f6 09 1c 76 7f d4 e1 cf 8f 5b Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:25:59 -0400 (0:00:01.288) 0:16:47.753 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:25:59 -0400 (0:00:00.230) 0:16:47.984 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:25:59 -0400 (0:00:00.285) 0:16:48.270 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:25:59 -0400 (0:00:00.285) 0:16:48.555 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:26:00 -0400 (0:00:00.194) 0:16:48.750 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:26:00 -0400 (0:00:00.365) 0:16:49.116 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:26:00 -0400 (0:00:00.241) 0:16:49.357 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:26:00 -0400 (0:00:00.266) 0:16:49.624 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:26:01 -0400 (0:00:00.120) 0:16:49.744 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:26:01 -0400 (0:00:00.190) 0:16:49.935 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:26:01 -0400 (0:00:00.222) 0:16:50.158 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:26:01 -0400 (0:00:00.310) 0:16:50.469 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:26:02 -0400 (0:00:00.351) 0:16:50.820 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:26:02 -0400 (0:00:00.192) 0:16:51.012 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:26:02 -0400 (0:00:00.223) 0:16:51.235 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* 
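The crypttab assertions above reduce to one invariant: /etc/crypttab must contain exactly one line for this volume, of the form "luks-<UUID> <backing device> <key file>", where the backing device is the LV (/dev/mapper/foo-test1) and the key file is "-" (prompt for the passphrase). A sketch of the same check written directly against /etc/crypttab (the wording and regex are illustrative; the test itself derives _storage_test_crypttab_entries from previously gathered crypttab contents, as shown above):

    # Sketch only: assert the /etc/crypttab entry for the encrypted volume.
    - name: Read /etc/crypttab
      ansible.builtin.command: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Expect exactly one "luks-<uuid> /dev/mapper/foo-test1 -" entry
      ansible.builtin.assert:
        that:
          - >-
            crypttab.stdout_lines
            | select('match', '^luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /dev/mapper/foo-test1 -')
            | list | length == 1
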
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:26:02 -0400 (0:00:00.178) 0:16:51.414 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:26:02 -0400 (0:00:00.283) 0:16:51.697 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:26:03 -0400 (0:00:00.268) 0:16:51.966 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:26:03 -0400 (0:00:00.255) 0:16:52.222 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:26:03 -0400 (0:00:00.256) 0:16:52.478 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:26:04 -0400 (0:00:00.291) 0:16:52.770 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:26:04 -0400 (0:00:00.287) 0:16:53.057 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:26:04 -0400 (0:00:00.385) 0:16:53.442 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:26:04 -0400 (0:00:00.261) 0:16:53.703 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:26:08 -0400 (0:00:03.766) 
0:16:57.470 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:26:10 -0400 (0:00:01.501) 0:16:58.971 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:26:10 -0400 (0:00:00.411) 0:16:59.383 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:26:11 -0400 (0:00:00.383) 0:16:59.767 ********** ok: [managed-node16] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:26:12 -0400 (0:00:01.668) 0:17:01.435 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:26:12 -0400 (0:00:00.285) 0:17:01.720 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:26:13 -0400 (0:00:00.323) 0:17:02.044 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:26:13 -0400 (0:00:00.319) 0:17:02.364 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:26:13 -0400 (0:00:00.329) 0:17:02.693 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:26:14 -0400 (0:00:00.278) 0:17:02.972 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:26:14 
-0400 (0:00:00.296) 0:17:03.268 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:26:14 -0400 (0:00:00.374) 0:17:03.643 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:26:15 -0400 (0:00:00.359) 0:17:04.002 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:26:15 -0400 (0:00:00.259) 0:17:04.262 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:26:15 -0400 (0:00:00.140) 0:17:04.402 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:26:15 -0400 (0:00:00.199) 0:17:04.602 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:26:16 -0400 (0:00:00.194) 0:17:04.796 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:26:16 -0400 (0:00:00.167) 0:17:04.964 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:26:16 -0400 (0:00:00.156) 0:17:05.120 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:26:16 -0400 (0:00:00.128) 0:17:05.249 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 
20:26:16 -0400 (0:00:00.219) 0:17:05.468 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:26:16 -0400 (0:00:00.114) 0:17:05.582 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:26:16 -0400 (0:00:00.139) 0:17:05.722 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:26:17 -0400 (0:00:00.181) 0:17:05.904 ********** ok: [managed-node16] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:26:17 -0400 (0:00:00.186) 0:17:06.090 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:26:17 -0400 (0:00:00.276) 0:17:06.366 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:26:17 -0400 (0:00:00.302) 0:17:06.669 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023666", "end": "2026-04-17 20:26:19.298896", "rc": 0, "start": "2026-04-17 20:26:19.275230" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:26:19 -0400 (0:00:01.690) 0:17:08.359 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:26:19 -0400 (0:00:00.251) 0:17:08.610 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] 
******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:26:20 -0400 (0:00:00.267) 0:17:08.878 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:26:20 -0400 (0:00:00.285) 0:17:09.164 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:26:20 -0400 (0:00:00.306) 0:17:09.470 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:26:21 -0400 (0:00:00.297) 0:17:09.767 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:26:21 -0400 (0:00:00.278) 0:17:10.046 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:26:21 -0400 (0:00:00.228) 0:17:10.274 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:26:21 -0400 (0:00:00.164) 0:17:10.439 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:391 Friday 17 April 2026 20:26:21 -0400 (0:00:00.254) 0:17:10.694 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:26:22 -0400 (0:00:00.312) 0:17:11.007 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:26:22 -0400 (0:00:00.314) 0:17:11.321 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:26:22 -0400 (0:00:00.303) 0:17:11.625 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:26:24 -0400 (0:00:01.677) 0:17:13.302 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:26:24 -0400 (0:00:00.248) 0:17:13.551 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:26:26 -0400 (0:00:02.166) 0:17:15.718 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:26:27 -0400 (0:00:00.442) 0:17:16.160 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:26:27 -0400 (0:00:00.201) 0:17:16.362 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:26:27 -0400 (0:00:00.242) 0:17:16.605 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:26:28 -0400 (0:00:00.200) 0:17:16.805 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:26:28 -0400 (0:00:00.276) 0:17:17.082 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:26:28 -0400 (0:00:00.411) 0:17:17.494 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:26:29 -0400 (0:00:00.303) 0:17:17.797 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:26:29 -0400 (0:00:00.257) 0:17:18.054 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:26:33 -0400 (0:00:04.213) 0:17:22.268 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:26:33 -0400 (0:00:00.265) 0:17:22.533 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:26:34 -0400 
(0:00:00.300) 0:17:22.834 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:26:39 -0400 (0:00:05.656) 0:17:28.491 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:26:40 -0400 (0:00:00.340) 0:17:28.831 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:26:40 -0400 (0:00:00.216) 0:17:29.048 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:26:40 -0400 (0:00:00.151) 0:17:29.199 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:26:40 -0400 (0:00:00.195) 0:17:29.394 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:26:44 -0400 (0:00:04.227) 0:17:33.622 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": 
"dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": 
"lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" 
}, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service": { "name": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service": { "name": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": 
"systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:26:47 -0400 (0:00:02.638) 0:17:36.260 ********** changed: 
[managed-node16] => (item=systemd-cryptsetup@luks\x2d7a5c617a\x2d1200\x2d4191\x2d9727\x2da5bffa31fbb7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "name": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice -.mount systemd-journald.socket tmp.mount dev-sda1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7a5c617a-1200-4191-9727-a5bffa31fbb7", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7a5c617a-1200-4191-9727-a5bffa31fbb7 /dev/sda1 /tmp/storage_testa62n2quhlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7a5c617a-1200-4191-9727-a5bffa31fbb7 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount system-systemd\\x2dcryptsetup.slice", "RequiresMountsFor": "/tmp/storage_testa62n2quhlukskey", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:24:59 EDT", 
"StateChangeTimestampMonotonic": "2557243391", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...d1200\x2d4191\x2d9727\x2da5bffa31fbb7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "name": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "IgnoreOnIsolate": "no", 
"IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": 
"22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:26:49 -0400 (0:00:02.448) 0:17:38.709 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:26:55 -0400 (0:00:05.027) 0:17:43.736 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:26:55 -0400 (0:00:00.182) 0:17:43.919 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471897.1606827, "attr_flags": "", "attributes": [], 
"block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0cfa1462ec86ed343d675e75474e5bed1baf7336", "ctime": 1776471897.1576827, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471897.1576827, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:26:56 -0400 (0:00:01.622) 0:17:45.541 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:26:56 -0400 (0:00:00.167) 0:17:45.709 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d7a5c617a\x2d1200\x2d4191\x2d9727\x2da5bffa31fbb7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "name": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", 
"EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d7a5c617a\\x2d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...d1200\x2d4191\x2d9727\x2da5bffa31fbb7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "name": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", 
"IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d1200\\x2d4191\\x2d9727\\x2da5bffa31fbb7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", 
"SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:27:00 -0400 (0:00:03.121) 0:17:48.831 ********** ok: [managed-node16] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:27:00 -0400 (0:00:00.170) 0:17:49.001 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:27:00 -0400 (0:00:00.151) 0:17:49.152 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:27:00 -0400 (0:00:00.168) 0:17:49.321 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:27:00 -0400 (0:00:00.181) 0:17:49.502 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:27:02 -0400 (0:00:01.600) 0:17:51.103 ********** ok: [managed-node16] => (item={'src': '/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": 
"/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:27:03 -0400 (0:00:01.460) 0:17:52.563 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:27:04 -0400 (0:00:00.220) 0:17:52.783 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:27:05 -0400 (0:00:01.526) 0:17:54.310 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471912.1297278, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f16117cc80b0fc2fda7da96b15f88e44870128ac", "ctime": 1776471904.189704, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 270532806, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471904.189704, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "391408018", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:27:06 -0400 (0:00:00.928) 0:17:55.239 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:27:06 -0400 (0:00:00.158) 0:17:55.397 ********** ok: [managed-node16] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:404 Friday 17 April 2026 20:27:08 -0400 (0:00:01.752) 0:17:57.150 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:411 Friday 17 April 
2026 20:27:08 -0400 (0:00:00.200) 0:17:57.351 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:27:09 -0400 (0:00:00.390) 0:17:57.742 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:27:09 -0400 (0:00:00.249) 0:17:57.991 ********** skipping: [managed-node16] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:27:09 -0400 (0:00:00.213) 0:17:58.204 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "30ec6405-b599-4e78-a6b8-ead3f359c08c" }, "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "size": "4G", "type": "crypt", "uuid": "bfc36714-43ff-494e-b506-48e5ae1691f9" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "MXMlX8-1TPL-giWv-f9GY-miKG-0a91-sT0DvI" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:27:10 -0400 (0:00:01.427) 0:17:59.632 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002580", "end": "2026-04-17 20:27:12.102671", "rc": 0, "start": "2026-04-17 20:27:12.100091" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:27:12 -0400 (0:00:01.385) 0:18:01.017 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002712", "end": "2026-04-17 20:27:13.343133", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:27:13.340421" } STDOUT: luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:27:13 -0400 (0:00:01.283) 0:18:02.301 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:27:13 -0400 (0:00:00.132) 0:18:02.433 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:27:13 -0400 (0:00:00.102) 0:18:02.536 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023812", "end": "2026-04-17 20:27:14.475024", "rc": 0, "start": "2026-04-17 20:27:14.451212" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:27:14 -0400 (0:00:00.864) 0:18:03.401 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:27:14 -0400 (0:00:00.226) 0:18:03.627 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:27:15 -0400 (0:00:00.365) 0:18:03.993 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:27:16 -0400 (0:00:00.867) 0:18:04.861 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:27:17 -0400 (0:00:01.227) 0:18:06.088 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:27:17 -0400 (0:00:00.238) 0:18:06.326 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:27:17 -0400 (0:00:00.298) 0:18:06.625 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:27:18 -0400 (0:00:00.246) 0:18:06.871 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:27:18 -0400 (0:00:00.222) 0:18:07.093 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:27:18 -0400 (0:00:00.262) 0:18:07.356 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:27:18 -0400 (0:00:00.233) 0:18:07.590 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:27:19 -0400 (0:00:00.264) 0:18:07.854 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.9.63 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:27:20 -0400 (0:00:01.290) 0:18:09.144 ********** skipping: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:27:20 -0400 (0:00:00.145) 0:18:09.290 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:27:21 -0400 (0:00:00.509) 0:18:09.799 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:27:21 -0400 (0:00:00.267) 0:18:10.067 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:27:21 -0400 (0:00:00.198) 0:18:10.265 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:27:21 -0400 (0:00:00.230) 0:18:10.496 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:27:21 -0400 (0:00:00.227) 0:18:10.724 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:27:22 -0400 (0:00:00.287) 0:18:11.011 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:27:22 -0400 (0:00:00.324) 0:18:11.335 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:27:22 -0400 (0:00:00.210) 0:18:11.546 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:27:22 -0400 (0:00:00.160) 0:18:11.707 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:27:23 -0400 (0:00:00.229) 0:18:11.936 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:27:23 -0400 (0:00:00.200) 0:18:12.136 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:27:23 -0400 (0:00:00.219) 0:18:12.356 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:27:24 -0400 (0:00:00.484) 0:18:12.841 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node16 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 April 2026 20:27:24 -0400 (0:00:00.268) 0:18:13.110 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 April 2026 20:27:24 -0400 (0:00:00.177) 0:18:13.287 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 April 2026 20:27:24 -0400 (0:00:00.140) 0:18:13.429 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 April 2026 20:27:24 -0400 (0:00:00.224) 0:18:13.653 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 April 2026 20:27:25 -0400 (0:00:00.266) 0:18:13.919 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 April 2026 20:27:25 -0400 (0:00:00.147) 0:18:14.067 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 April 2026 20:27:25 -0400 (0:00:00.259) 0:18:14.326 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:27:25 -0400 (0:00:00.290) 0:18:14.617 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:27:26 -0400 (0:00:00.504) 0:18:15.122 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node16 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 April 2026 20:27:26 -0400 (0:00:00.261) 0:18:15.383 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 April 2026 20:27:26 -0400 (0:00:00.277) 0:18:15.661 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 April 2026 20:27:27 -0400 (0:00:00.302) 0:18:15.964 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 April 2026 20:27:27 -0400 (0:00:00.153) 0:18:16.117 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:27:27 -0400 (0:00:00.188) 0:18:16.306 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:27:28 -0400 (0:00:00.425) 0:18:16.732 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:27:28 -0400 (0:00:00.168) 0:18:16.901 ********** skipping: [managed-node16] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:27:28 -0400 (0:00:00.294) 0:18:17.195 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node16 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 April 2026 20:27:28 -0400 (0:00:00.291) 0:18:17.487 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 April 2026 20:27:29 -0400 (0:00:00.282) 0:18:17.769 ********** ok: [managed-node16] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 April 2026 20:27:29 -0400 (0:00:00.201) 0:18:17.971 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 April 2026 20:27:29 -0400 (0:00:00.222) 0:18:18.193 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 April 2026 20:27:29 -0400 (0:00:00.274) 0:18:18.468 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 April 2026 20:27:30 -0400 (0:00:00.366) 0:18:18.835 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:27:30 -0400 (0:00:00.138) 0:18:18.974 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:27:30 -0400 (0:00:00.165) 0:18:19.139 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:27:30 -0400 (0:00:00.391) 0:18:19.530 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node16 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 April 2026 20:27:31 -0400 (0:00:00.544) 0:18:20.074 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 April 2026 20:27:31 -0400 (0:00:00.164) 0:18:20.239 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 April 2026 20:27:31 -0400 (0:00:00.186) 0:18:20.426 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 April 2026 20:27:31 -0400 (0:00:00.165) 0:18:20.627 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 April 2026 20:27:32 -0400 (0:00:00.165) 0:18:20.792 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 April 2026 20:27:32 -0400 (0:00:00.137) 0:18:20.930 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 April 2026 20:27:32 -0400 (0:00:00.237) 0:18:21.167 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:27:32 -0400 (0:00:00.264) 0:18:21.431 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:27:33 -0400 (0:00:00.501) 0:18:21.933 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:27:33 -0400 (0:00:00.225) 0:18:22.149 ********** skipping: [managed-node16] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:27:33 -0400 (0:00:00.226) 0:18:22.375 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026
20:27:33 -0400 (0:00:00.226) 0:18:22.602 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:27:34 -0400 (0:00:00.225) 0:18:22.827 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:27:34 -0400 (0:00:00.225) 0:18:23.052 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:27:34 -0400 (0:00:00.227) 0:18:23.280 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:27:34 -0400 (0:00:00.248) 0:18:23.528 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:27:34 -0400 (0:00:00.175) 0:18:23.704 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:27:35 -0400 (0:00:00.396) 0:18:24.100 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:27:35 -0400 (0:00:00.273) 0:18:24.374 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:27:36 -0400 (0:00:01.211) 0:18:25.586 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:27:37 -0400 (0:00:00.717) 0:18:26.304 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:27:37 -0400 (0:00:00.214) 0:18:26.518 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:27:38 -0400 (0:00:00.259) 0:18:26.778 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:27:38 -0400 (0:00:00.147) 0:18:26.925 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:27:38 -0400 (0:00:00.156) 0:18:27.081 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:27:38 -0400 (0:00:00.148) 0:18:27.230 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:27:38 -0400 (0:00:00.234) 0:18:27.464 ********** 
skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:27:38 -0400 (0:00:00.169) 0:18:27.633 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:27:39 -0400 (0:00:00.250) 0:18:27.884 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:27:39 -0400 (0:00:00.270) 0:18:28.154 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:27:39 -0400 (0:00:00.220) 0:18:28.374 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:27:40 -0400 (0:00:00.424) 0:18:28.799 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:27:40 -0400 (0:00:00.308) 0:18:29.107 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:27:40 -0400 (0:00:00.209) 0:18:29.317 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:27:40 -0400 (0:00:00.128) 0:18:29.445 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:27:40 -0400 (0:00:00.219) 0:18:29.665 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:27:41 -0400 (0:00:00.168) 0:18:29.833 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:27:41 -0400 (0:00:00.296) 0:18:30.129 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:27:41 -0400 (0:00:00.248) 0:18:30.378 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471958.8498685, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471887.4096534, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 268804, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471887.4096534, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:27:43 -0400 (0:00:01.810) 0:18:32.188 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:27:43 -0400 (0:00:00.220) 0:18:32.408 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:27:43 -0400 (0:00:00.287) 0:18:32.696 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:27:44 -0400 (0:00:00.213) 0:18:32.910 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:27:44 -0400 (0:00:00.209) 0:18:33.119 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:27:44 -0400 (0:00:00.258) 0:18:33.378 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:27:44 -0400 (0:00:00.197) 0:18:33.575 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472014.797037, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471887.5466537, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 267984, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471887.5466537, "nlink": 1, "path": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:27:46 -0400 (0:00:01.523) 0:18:35.099 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:27:50 -0400 (0:00:04.081) 0:18:39.181 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009615", "end": "2026-04-17 20:27:51.692694", "rc": 0, "start": "2026-04-17 20:27:51.683079" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: a1 78 fd 06 7c 5f 93 28 ab 7b ef da a7 10 08 c8 81 1a b2 c7 MK salt: 1d 8a 9b d6 8c fd 1d a4 d2 e2 dc 83 23 70 07 6c fa 69 4f e0 6c 90 55 87 01 08 01 a6 17 83 9f 45 MK iterations: 120249 UUID: 30ec6405-b599-4e78-a6b8-ead3f359c08c Key Slot 0: ENABLED Iterations: 1927528 Salt: a1 98 f5 d5 93 5b b3 00 72 4e 88 f3 de 8c 15 31 51 f7 34 bd 42 
09 f6 09 1c 76 7f d4 e1 cf 8f 5b Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:27:51 -0400 (0:00:01.488) 0:18:40.670 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:27:52 -0400 (0:00:00.365) 0:18:41.036 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:27:52 -0400 (0:00:00.330) 0:18:41.366 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:27:52 -0400 (0:00:00.213) 0:18:41.579 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:27:53 -0400 (0:00:00.341) 0:18:41.920 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:27:53 -0400 (0:00:00.299) 0:18:42.219 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:27:53 -0400 (0:00:00.225) 0:18:42.445 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:27:53 -0400 (0:00:00.261) 0:18:42.707 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:27:54 
-0400 (0:00:00.261) 0:18:42.968 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:27:54 -0400 (0:00:00.270) 0:18:43.239 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:27:54 -0400 (0:00:00.303) 0:18:43.543 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:27:55 -0400 (0:00:00.194) 0:18:43.737 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:27:55 -0400 (0:00:00.260) 0:18:43.998 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:27:55 -0400 (0:00:00.183) 0:18:44.182 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:27:55 -0400 (0:00:00.341) 0:18:44.523 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:27:55 -0400 (0:00:00.193) 0:18:44.717 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:27:56 -0400 (0:00:00.291) 0:18:45.008 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:27:56 -0400 (0:00:00.241) 0:18:45.249 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the 
chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:27:56 -0400 (0:00:00.238) 0:18:45.488 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:27:57 -0400 (0:00:00.244) 0:18:45.732 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:27:57 -0400 (0:00:00.256) 0:18:45.989 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:27:57 -0400 (0:00:00.131) 0:18:46.121 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:27:57 -0400 (0:00:00.177) 0:18:46.298 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:27:57 -0400 (0:00:00.144) 0:18:46.443 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:27:59 -0400 (0:00:01.531) 0:18:47.974 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:28:00 -0400 (0:00:01.458) 0:18:49.433 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:28:01 -0400 (0:00:00.361) 0:18:49.794 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:28:01 -0400 (0:00:00.194) 0:18:49.989 ********** ok: [managed-node16] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:28:02 -0400 (0:00:01.441) 0:18:51.430 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:28:02 -0400 (0:00:00.225) 0:18:51.655 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:28:03 -0400 (0:00:00.281) 0:18:51.937 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:28:03 -0400 (0:00:00.257) 0:18:52.194 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:28:03 -0400 (0:00:00.257) 0:18:52.451 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:28:04 -0400 (0:00:00.320) 0:18:52.772 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:28:04 -0400 (0:00:00.285) 0:18:53.057 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:28:04 -0400 (0:00:00.237) 0:18:53.295 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:28:04 -0400 (0:00:00.200) 0:18:53.496 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] 
******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:28:04 -0400 (0:00:00.144) 0:18:53.640 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:28:05 -0400 (0:00:00.234) 0:18:53.875 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:28:05 -0400 (0:00:00.104) 0:18:53.980 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:28:05 -0400 (0:00:00.156) 0:18:54.137 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:28:05 -0400 (0:00:00.165) 0:18:54.302 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:28:05 -0400 (0:00:00.250) 0:18:54.553 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:28:06 -0400 (0:00:00.209) 0:18:54.762 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:28:06 -0400 (0:00:00.097) 0:18:54.859 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:28:06 -0400 (0:00:00.093) 0:18:54.953 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:28:06 -0400 (0:00:00.093) 0:18:55.046 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] 
******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:28:06 -0400 (0:00:00.048) 0:18:55.095 ********** ok: [managed-node16] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:28:06 -0400 (0:00:00.059) 0:18:55.154 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:28:06 -0400 (0:00:00.105) 0:18:55.259 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:28:06 -0400 (0:00:00.158) 0:18:55.418 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023433", "end": "2026-04-17 20:28:07.526980", "rc": 0, "start": "2026-04-17 20:28:07.503547" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:28:07 -0400 (0:00:01.054) 0:18:56.473 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:28:07 -0400 (0:00:00.097) 0:18:56.570 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:28:08 -0400 (0:00:00.255) 0:18:56.826 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:28:08 -0400 (0:00:00.184) 0:18:57.010 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:28:08 -0400 
(0:00:00.224) 0:18:57.235 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:28:08 -0400 (0:00:00.213) 0:18:57.449 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:28:08 -0400 (0:00:00.170) 0:18:57.620 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:28:09 -0400 (0:00:00.208) 0:18:57.829 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:28:09 -0400 (0:00:00.140) 0:18:57.969 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:28:09 -0400 (0:00:00.162) 0:18:58.131 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:417 Friday 17 April 2026 20:28:10 -0400 (0:00:01.396) 0:18:59.528 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:28:11 -0400 (0:00:00.297) 0:18:59.825 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:28:11 -0400 (0:00:00.231) 0:19:00.056 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:28:11 -0400 (0:00:00.298) 0:19:00.355 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:28:11 -0400 (0:00:00.181) 0:19:00.536 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:28:12 -0400 (0:00:00.228) 0:19:00.764 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:28:13 -0400 (0:00:01.769) 0:19:02.534 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:28:14 -0400 (0:00:00.212) 0:19:02.746 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:28:15 -0400 (0:00:01.787) 0:19:04.534 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:28:16 -0400 (0:00:00.404) 0:19:04.939 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:28:16 -0400 (0:00:00.253) 0:19:05.192 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:28:16 -0400 (0:00:00.189) 0:19:05.382 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:28:16 -0400 (0:00:00.117) 0:19:05.499 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:28:16 -0400 (0:00:00.165) 0:19:05.664 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:28:17 -0400 (0:00:00.361) 0:19:06.025 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:28:17 -0400 (0:00:00.210) 0:19:06.235 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:28:17 -0400 (0:00:00.171) 0:19:06.407 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:28:21 -0400 (0:00:04.149) 0:19:10.556 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:28:22 -0400 (0:00:00.286) 0:19:10.842 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:28:22 -0400 (0:00:00.278) 0:19:11.122 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:28:28 -0400 (0:00:05.839) 0:19:16.961 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:28:28 -0400 (0:00:00.502) 0:19:17.463 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:28:28 -0400 (0:00:00.140) 0:19:17.604 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:28:29 -0400 (0:00:00.264) 0:19:17.868 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:28:29 -0400 (0:00:00.195) 0:19:18.063 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:28:33 -0400 (0:00:04.180) 0:19:22.244 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service": { "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service": { "name": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:28:36 -0400 (0:00:03.186) 0:19:25.431 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d30ec6405\x2db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-30ec6405-b599-4e78-a6b8-ead3f359c08c ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", 
"Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:49 EDT", "StateChangeTimestampMonotonic": "2667845278", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", 
"FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", 
"StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:28:40 -0400 (0:00:03.483) 0:19:28.914 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-30ec6405-b599-4e78-a6b8-ead3f359c08c' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:28:45 -0400 (0:00:05.666) 0:19:34.580 ********** fatal: [managed-node16]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-30ec6405-b599-4e78-a6b8-ead3f359c08c' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 
'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:28:47 -0400 (0:00:01.221) 0:19:35.802 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d30ec6405\x2db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": 
"no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:49 EDT", "StateChangeTimestampMonotonic": "2667845278", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": 
"", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:28:50 -0400 (0:00:03.676) 0:19:39.479 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:28:50 -0400 (0:00:00.225) 0:19:39.704 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:28:51 -0400 (0:00:00.274) 0:19:39.979 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:28:51 -0400 (0:00:00.119) 0:19:40.098 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472090.5842676, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776472090.5842676, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776472090.5842676, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1608311293", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:28:52 -0400 (0:00:01.268) 0:19:41.366 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:440 Friday 17 April 2026 20:28:52 -0400 (0:00:00.230) 0:19:41.597 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:28:53 -0400 (0:00:00.464) 0:19:42.061 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:28:53 -0400 (0:00:00.211) 0:19:42.272 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:28:53 -0400 (0:00:00.292) 0:19:42.565 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:28:55 -0400 (0:00:01.336) 0:19:43.902 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:28:55 -0400 (0:00:00.062) 0:19:43.965 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:28:56 -0400 (0:00:01.366) 0:19:45.331 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", 
"kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:28:57 -0400 (0:00:00.476) 0:19:45.807 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:28:57 -0400 (0:00:00.251) 0:19:46.059 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:28:57 -0400 (0:00:00.164) 0:19:46.224 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:28:57 -0400 (0:00:00.146) 0:19:46.371 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:28:57 -0400 (0:00:00.111) 0:19:46.483 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:28:58 -0400 (0:00:00.368) 0:19:46.851 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:28:58 -0400 (0:00:00.128) 0:19:46.980 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:28:58 -0400 (0:00:00.104) 0:19:47.085 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 
2026 20:29:02 -0400 (0:00:03.995) 0:19:51.080 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:29:02 -0400 (0:00:00.239) 0:19:51.319 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:29:02 -0400 (0:00:00.172) 0:19:51.492 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:29:08 -0400 (0:00:05.443) 0:19:56.936 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:29:08 -0400 (0:00:00.294) 0:19:57.230 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:29:08 -0400 (0:00:00.354) 0:19:57.585 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:29:09 -0400 (0:00:00.166) 0:19:57.751 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:29:09 -0400 (0:00:00.233) 0:19:57.984 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:29:13 -0400 (0:00:03.986) 0:20:01.971 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { 
"name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { 
"name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { 
"name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service": { "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service": { "name": 
"systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:29:16 -0400 (0:00:02.834) 0:20:04.806 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d30ec6405\x2db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", 
"ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-30ec6405-b599-4e78-a6b8-ead3f359c08c ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target 
dev-mapper-luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:49 EDT", "StateChangeTimestampMonotonic": "2667845278", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", 
"DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", 
"RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:29:19 -0400 (0:00:03.063) 0:20:07.869 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, 
"encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:29:24 -0400 (0:00:05.650) 0:20:13.520 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:29:25 -0400 (0:00:00.326) 0:20:13.846 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471897.1606827, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0cfa1462ec86ed343d675e75474e5bed1baf7336", "ctime": 1776471897.1576827, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471897.1576827, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:29:26 -0400 (0:00:01.645) 0:20:15.491 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:29:28 -0400 (0:00:01.624) 0:20:17.116 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d30ec6405\x2db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", 
"BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", 
"LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:49 EDT", "StateChangeTimestampMonotonic": "2667845278", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", 
"MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:29:32 -0400 (0:00:03.624) 0:20:20.741 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 
0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:29:32 -0400 (0:00:00.253) 0:20:20.995 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": 
null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:29:32 -0400 (0:00:00.191) 0:20:21.186 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:29:32 -0400 (0:00:00.198) 0:20:21.385 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-30ec6405-b599-4e78-a6b8-ead3f359c08c" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:29:34 -0400 (0:00:01.789) 0:20:23.174 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:29:36 -0400 (0:00:01.939) 0:20:25.113 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:29:38 -0400 (0:00:01.639) 0:20:26.753 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell 
systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:29:38 -0400 (0:00:00.363) 0:20:27.117 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:29:40 -0400 (0:00:01.944) 0:20:29.061 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776471912.1297278, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f16117cc80b0fc2fda7da96b15f88e44870128ac", "ctime": 1776471904.189704, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 270532806, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471904.189704, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "391408018", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:29:41 -0400 (0:00:01.534) 0:20:30.595 ********** changed: [managed-node16] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-30ec6405-b599-4e78-a6b8-ead3f359c08c', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:29:43 -0400 (0:00:01.603) 0:20:32.199 ********** ok: [managed-node16] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:455 Friday 17 April 2026 20:29:45 -0400 (0:00:01.854) 0:20:34.054 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:29:45 -0400 (0:00:00.374) 0:20:34.428 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:29:45 -0400 (0:00:00.265) 0:20:34.693 ********** skipping: [managed-node16] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:29:46 -0400 (0:00:00.178) 0:20:34.871 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "3f16b262-3e2a-425d-a88e-52f1c907552e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "MXMlX8-1TPL-giWv-f9GY-miKG-0a91-sT0DvI" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": 
"fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:29:47 -0400 (0:00:01.396) 0:20:36.268 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002360", "end": "2026-04-17 20:29:48.514581", "rc": 0, "start": "2026-04-17 20:29:48.512221" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:29:48 -0400 (0:00:01.175) 0:20:37.444 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002888", "end": "2026-04-17 20:29:50.122522", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:29:50.119634" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:29:50 -0400 (0:00:01.646) 0:20:39.091 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:29:50 -0400 (0:00:00.505) 0:20:39.597 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:29:51 -0400 (0:00:00.161) 0:20:39.758 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": 
"0:00:00.023967", "end": "2026-04-17 20:29:52.484952", "rc": 0, "start": "2026-04-17 20:29:52.460985" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:29:52 -0400 (0:00:01.742) 0:20:41.500 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:29:53 -0400 (0:00:00.436) 0:20:41.937 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:29:53 -0400 (0:00:00.534) 0:20:42.471 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:29:54 -0400 (0:00:00.271) 0:20:42.743 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:29:55 -0400 (0:00:01.493) 0:20:44.236 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:29:55 -0400 (0:00:00.278) 0:20:44.515 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:29:57 -0400 (0:00:01.449) 0:20:45.964 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:29:57 -0400 (0:00:00.362) 0:20:46.327 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 
20:29:57 -0400 (0:00:00.367) 0:20:46.694 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:29:58 -0400 (0:00:00.274) 0:20:46.969 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:29:58 -0400 (0:00:00.418) 0:20:47.387 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:29:59 -0400 (0:00:00.377) 0:20:47.765 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.9.63 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:30:00 -0400 (0:00:01.770) 0:20:49.535 ********** skipping: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:30:01 -0400 (0:00:00.420) 0:20:49.956 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:30:01 -0400 (0:00:00.556) 0:20:50.512 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:30:02 -0400 (0:00:00.235) 0:20:50.748 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:30:02 -0400 (0:00:00.240) 0:20:50.988 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 
2026 20:30:02 -0400 (0:00:00.216) 0:20:51.204 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:30:02 -0400 (0:00:00.197) 0:20:51.402 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:30:02 -0400 (0:00:00.231) 0:20:51.633 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:30:03 -0400 (0:00:00.241) 0:20:51.875 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:30:03 -0400 (0:00:00.269) 0:20:52.145 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:30:03 -0400 (0:00:00.246) 0:20:52.391 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:30:03 -0400 (0:00:00.290) 0:20:52.682 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:30:04 -0400 (0:00:00.289) 0:20:52.972 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:30:04 -0400 (0:00:00.187) 0:20:53.159 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:30:04 -0400 (0:00:00.412) 0:20:53.572 ********** 
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node16 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 April 2026 20:30:05 -0400 (0:00:00.457) 0:20:54.030 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 April 2026 20:30:05 -0400 (0:00:00.365) 0:20:54.395 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 April 2026 20:30:05 -0400 (0:00:00.262) 0:20:54.658 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 April 2026 20:30:06 -0400 (0:00:00.261) 0:20:54.919 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 April 2026 20:30:06 -0400 (0:00:00.264) 0:20:55.183 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 April 2026 20:30:06 -0400 (0:00:00.225) 0:20:55.409 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 April 2026 20:30:07 -0400 (0:00:00.328) 0:20:55.737 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:30:07 -0400 (0:00:00.268) 0:20:56.006 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:30:07 -0400 (0:00:00.626) 0:20:56.633 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node16 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 April 2026 20:30:08 -0400 (0:00:00.420) 0:20:57.053 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 April 2026 20:30:08 -0400 (0:00:00.189) 0:20:57.242 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 April 2026 20:30:08 -0400 (0:00:00.287) 0:20:57.530 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 April 2026 20:30:09 -0400 (0:00:00.299) 0:20:57.829 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:30:09 -0400 (0:00:00.247) 0:20:58.077 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:30:09 -0400 (0:00:00.645) 0:20:58.723 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:30:10 -0400 (0:00:00.449) 0:20:59.172 ********** skipping: [managed-node16] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:30:10 -0400 (0:00:00.225) 0:20:59.398 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node16 TASK [Set variables used by tests] ********************************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 April 2026 20:30:11 -0400 (0:00:00.560) 0:20:59.958 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 April 2026 20:30:11 -0400 (0:00:00.423) 0:21:00.382 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 April 2026 20:30:12 -0400 (0:00:00.379) 0:21:00.762 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 April 2026 20:30:12 -0400 (0:00:00.408) 0:21:01.170 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 April 2026 20:30:12 -0400 (0:00:00.256) 0:21:01.427 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 April 2026 20:30:12 -0400 (0:00:00.275) 0:21:01.702 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:30:13 -0400 (0:00:00.233) 0:21:01.935 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:30:13 -0400 (0:00:00.112) 0:21:02.047 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:30:13 -0400 (0:00:00.476) 0:21:02.524 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node16 TASK [Get information about VDO deduplication] ********************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 April 2026 20:30:14 -0400 (0:00:00.321) 0:21:02.845 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 April 2026 20:30:14 -0400 (0:00:00.198) 0:21:03.044 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 April 2026 20:30:14 -0400 (0:00:00.180) 0:21:03.225 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 April 2026 20:30:14 -0400 (0:00:00.161) 0:21:03.386 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 April 2026 20:30:14 -0400 (0:00:00.197) 0:21:03.583 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 April 2026 20:30:15 -0400 (0:00:00.233) 0:21:03.817 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 April 2026 20:30:15 -0400 (0:00:00.230) 0:21:04.048 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:30:15 -0400 (0:00:00.195) 0:21:04.244 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:30:15 -0400 (0:00:00.365) 0:21:04.609 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:30:16 -0400 
(0:00:00.203) 0:21:04.813 ********** skipping: [managed-node16] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:30:16 -0400 (0:00:00.101) 0:21:04.914 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:30:16 -0400 (0:00:00.319) 0:21:05.234 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:30:16 -0400 (0:00:00.206) 0:21:05.440 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:30:16 -0400 (0:00:00.279) 0:21:05.719 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:30:17 -0400 (0:00:00.207) 0:21:05.926 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:30:17 -0400 (0:00:00.247) 0:21:06.174 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:30:17 -0400 (0:00:00.203) 0:21:06.377 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:30:17 -0400 (0:00:00.345) 0:21:06.723 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 
20:30:18 -0400 (0:00:00.204) 0:21:06.927 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:30:19 -0400 (0:00:00.944) 0:21:07.872 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:30:19 -0400 (0:00:00.117) 0:21:07.989 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:30:19 -0400 (0:00:00.128) 0:21:08.118 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:30:19 -0400 (0:00:00.279) 0:21:08.398 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:30:19 -0400 (0:00:00.282) 0:21:08.681 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:30:20 -0400 (0:00:00.241) 0:21:08.923 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] 
************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:30:20 -0400 (0:00:00.213) 0:21:09.136 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:30:20 -0400 (0:00:00.146) 0:21:09.282 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:30:20 -0400 (0:00:00.209) 0:21:09.491 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:30:20 -0400 (0:00:00.234) 0:21:09.726 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:30:21 -0400 (0:00:00.206) 0:21:09.933 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:30:21 -0400 (0:00:00.149) 0:21:10.082 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:30:21 -0400 (0:00:00.383) 0:21:10.465 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:30:21 -0400 (0:00:00.209) 0:21:10.675 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:30:22 -0400 (0:00:00.181) 0:21:10.856 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:30:22 -0400 (0:00:00.151) 0:21:11.007 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:30:22 -0400 (0:00:00.111) 0:21:11.119 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:30:22 -0400 (0:00:00.143) 0:21:11.262 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:30:22 -0400 (0:00:00.223) 0:21:11.486 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:30:23 -0400 (0:00:00.317) 0:21:11.803 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472164.5094929, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776472164.5094929, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 300885, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776472164.5094929, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:30:24 -0400 (0:00:01.310) 0:21:13.114 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:30:24 -0400 (0:00:00.157) 0:21:13.272 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:30:24 -0400 (0:00:00.255) 0:21:13.527 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:30:24 -0400 (0:00:00.189) 0:21:13.717 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:30:25 -0400 (0:00:00.205) 0:21:13.923 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:30:25 -0400 (0:00:00.168) 0:21:14.091 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:30:25 -0400 (0:00:00.246) 0:21:14.337 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:30:25 -0400 (0:00:00.179) 0:21:14.516 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:30:29 -0400 (0:00:03.996) 0:21:18.513 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:30:30 -0400 (0:00:00.235) 0:21:18.748 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:30:30 -0400 (0:00:00.925) 0:21:19.674 ********** ok: [managed-node16] => 
{ "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:30:31 -0400 (0:00:00.338) 0:21:20.013 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:30:31 -0400 (0:00:00.276) 0:21:20.289 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:30:31 -0400 (0:00:00.142) 0:21:20.431 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:30:31 -0400 (0:00:00.136) 0:21:20.567 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:30:31 -0400 (0:00:00.112) 0:21:20.680 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:30:32 -0400 (0:00:00.215) 0:21:20.896 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:30:32 -0400 (0:00:00.161) 0:21:21.057 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:30:32 -0400 (0:00:00.151) 0:21:21.209 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:30:32 -0400 (0:00:00.277) 0:21:21.486 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:30:32 -0400 (0:00:00.190) 0:21:21.677 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:30:33 -0400 (0:00:00.231) 0:21:21.908 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:30:33 -0400 (0:00:00.292) 0:21:22.201 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:30:33 -0400 (0:00:00.233) 0:21:22.435 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:30:34 -0400 (0:00:00.303) 0:21:22.739 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:30:34 -0400 (0:00:00.201) 0:21:22.940 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:30:34 -0400 (0:00:00.255) 0:21:23.196 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:30:34 -0400 (0:00:00.242) 0:21:23.439 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:30:34 -0400 (0:00:00.108) 0:21:23.547 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] 
****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:30:35 -0400 (0:00:00.180) 0:21:23.728 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:30:35 -0400 (0:00:00.122) 0:21:23.851 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:30:35 -0400 (0:00:00.175) 0:21:24.026 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:30:35 -0400 (0:00:00.158) 0:21:24.184 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:30:36 -0400 (0:00:01.081) 0:21:25.266 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:30:37 -0400 (0:00:01.094) 0:21:26.360 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:30:37 -0400 (0:00:00.112) 0:21:26.472 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:30:37 -0400 (0:00:00.163) 0:21:26.636 ********** ok: [managed-node16] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:30:39 -0400 (0:00:01.187) 0:21:27.823 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:30:39 -0400 (0:00:00.213) 
0:21:28.036 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:30:39 -0400 (0:00:00.342) 0:21:28.378 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:30:39 -0400 (0:00:00.149) 0:21:28.528 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:30:40 -0400 (0:00:00.220) 0:21:28.764 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:30:40 -0400 (0:00:00.201) 0:21:28.966 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:30:40 -0400 (0:00:00.193) 0:21:29.159 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:30:40 -0400 (0:00:00.162) 0:21:29.322 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:30:40 -0400 (0:00:00.268) 0:21:29.591 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:30:41 -0400 (0:00:00.160) 0:21:29.762 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:30:41 -0400 (0:00:00.177) 0:21:29.940 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:30:41 -0400 (0:00:00.172) 0:21:30.113 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:30:41 -0400 (0:00:00.269) 0:21:30.382 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:30:41 -0400 (0:00:00.042) 0:21:30.425 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:30:41 -0400 (0:00:00.157) 0:21:30.582 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:30:41 -0400 (0:00:00.093) 0:21:30.675 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:30:42 -0400 (0:00:00.084) 0:21:30.759 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:30:42 -0400 (0:00:00.065) 0:21:30.824 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:30:42 -0400 (0:00:00.033) 0:21:30.858 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:30:42 -0400 (0:00:00.142) 0:21:31.001 ********** ok: [managed-node16] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:30:42 -0400 (0:00:00.077) 0:21:31.078 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual 
size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:30:42 -0400 (0:00:00.079) 0:21:31.157 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:30:42 -0400 (0:00:00.073) 0:21:31.230 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023050", "end": "2026-04-17 20:30:43.166918", "rc": 0, "start": "2026-04-17 20:30:43.143868" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:30:43 -0400 (0:00:00.885) 0:21:32.116 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:30:43 -0400 (0:00:00.210) 0:21:32.327 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:30:43 -0400 (0:00:00.124) 0:21:32.451 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:30:43 -0400 (0:00:00.146) 0:21:32.598 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:30:43 -0400 (0:00:00.070) 0:21:32.668 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:30:44 -0400 (0:00:00.130) 0:21:32.799 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:30:44 -0400 (0:00:00.095) 0:21:32.895 ********** ok: 
[managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:30:44 -0400 (0:00:00.128) 0:21:33.023 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:30:44 -0400 (0:00:00.091) 0:21:33.115 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:30:44 -0400 (0:00:00.156) 0:21:33.271 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:461 Friday 17 April 2026 20:30:45 -0400 (0:00:01.092) 0:21:34.364 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:30:45 -0400 (0:00:00.153) 0:21:34.517 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:30:45 -0400 (0:00:00.074) 0:21:34.592 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:30:45 -0400 (0:00:00.065) 0:21:34.658 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:30:45 -0400 (0:00:00.028) 0:21:34.686 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:30:46 -0400 (0:00:00.052) 0:21:34.738 ********** ok: [managed-node16] => { "ansible_facts": { 
"discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:30:47 -0400 (0:00:01.262) 0:21:36.001 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:30:47 -0400 (0:00:00.123) 0:21:36.125 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:30:48 -0400 (0:00:01.268) 0:21:37.394 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:30:48 -0400 (0:00:00.235) 0:21:37.629 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:30:49 -0400 (0:00:00.114) 0:21:37.743 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:30:49 -0400 (0:00:00.227) 0:21:37.971 ********** ok: 
[managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:30:49 -0400 (0:00:00.155) 0:21:38.127 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:30:49 -0400 (0:00:00.219) 0:21:38.346 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:30:49 -0400 (0:00:00.338) 0:21:38.685 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:30:50 -0400 (0:00:00.111) 0:21:38.796 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:30:50 -0400 (0:00:00.210) 0:21:39.007 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:30:53 -0400 (0:00:03.458) 0:21:42.465 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:30:53 -0400 (0:00:00.108) 0:21:42.573 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:30:53 -0400 (0:00:00.135) 0:21:42.709 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:30:58 -0400 (0:00:04.813) 0:21:47.523 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:30:58 -0400 (0:00:00.164) 0:21:47.687 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:30:59 -0400 (0:00:00.078) 0:21:47.766 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:30:59 -0400 (0:00:00.104) 0:21:47.870 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:30:59 -0400 (0:00:00.031) 0:21:47.902 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:31:02 -0400 (0:00:03.061) 0:21:50.963 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { 
"name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { 
"name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service": { "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service": { "name": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:31:04 -0400 (0:00:02.289) 0:21:53.253 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d30ec6405\x2db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": 
"dev-mapper-foo\\x2dtest1.device systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-30ec6405-b599-4e78-a6b8-ead3f359c08c", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-30ec6405-b599-4e78-a6b8-ead3f359c08c /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-30ec6405-b599-4e78-a6b8-ead3f359c08c ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", 
"KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:49 EDT", "StateChangeTimestampMonotonic": "2667845278", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not 
set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", 
"LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:31:07 -0400 (0:00:02.939) 0:21:56.193 ********** fatal: [managed-node16]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:31:12 -0400 (0:00:05.291) 0:22:01.485 ********** fatal: [managed-node16]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 
'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:31:12 -0400 (0:00:00.196) 0:22:01.681 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d30ec6405\x2db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", 
"KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d30ec6405\\x2db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", 
"WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luk...db599\x2d4e78\x2da6b8\x2dead3f359c08c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "name": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db599\\x2d4e78\\x2da6b8\\x2dead3f359c08c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:31:15 -0400 (0:00:02.326) 0:22:04.008 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:31:15 -0400 (0:00:00.104) 0:22:04.113 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:31:15 -0400 (0:00:00.284) 0:22:04.398 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:31:15 -0400 (0:00:00.099) 0:22:04.497 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472245.5107396, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776472245.5107396, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776472245.5107396, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "533344363", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:31:16 -0400 (0:00:01.005) 0:22:05.503 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:484 Friday 17 April 2026 20:31:16 -0400 (0:00:00.105) 0:22:05.608 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:31:17 -0400 (0:00:00.133) 0:22:05.742 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:31:17 -0400 (0:00:00.044) 0:22:05.786 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:31:17 -0400 (0:00:00.079) 0:22:05.866 ********** ok: [managed-node16] => { "ansible_facts": { 
"discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:31:18 -0400 (0:00:01.258) 0:22:07.124 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:31:18 -0400 (0:00:00.134) 0:22:07.259 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:31:19 -0400 (0:00:01.255) 0:22:08.514 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:31:20 -0400 (0:00:00.302) 0:22:08.816 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:31:20 -0400 (0:00:00.098) 0:22:08.914 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:31:20 -0400 (0:00:00.131) 0:22:09.046 ********** ok: 
[managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:31:20 -0400 (0:00:00.131) 0:22:09.177 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:31:20 -0400 (0:00:00.107) 0:22:09.285 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:31:20 -0400 (0:00:00.227) 0:22:09.512 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:31:20 -0400 (0:00:00.116) 0:22:09.629 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:31:20 -0400 (0:00:00.057) 0:22:09.687 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:31:24 -0400 (0:00:03.707) 0:22:13.394 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:31:24 -0400 (0:00:00.090) 0:22:13.485 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:31:24 -0400 (0:00:00.126) 0:22:13.612 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:31:29 -0400 (0:00:04.958) 0:22:18.570 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:31:30 -0400 (0:00:00.160) 0:22:18.731 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:31:30 -0400 (0:00:00.030) 0:22:18.761 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:31:30 -0400 (0:00:00.153) 0:22:18.915 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:31:30 -0400 (0:00:00.095) 0:22:19.010 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:31:33 -0400 (0:00:03.440) 0:22:22.451 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { 
"name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { 
"name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:31:36 -0400 (0:00:02.474) 0:22:24.926 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:31:36 -0400 (0:00:00.367) 0:22:25.293 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", 
"/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:31:51 -0400 (0:00:14.470) 0:22:39.763 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:31:51 -0400 (0:00:00.153) 0:22:39.916 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472177.7805333, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1776472177.7785332, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776472177.7785332, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, 
"rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:31:52 -0400 (0:00:01.271) 0:22:41.188 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:31:53 -0400 (0:00:01.488) 0:22:42.676 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:31:54 -0400 (0:00:00.242) 0:22:42.919 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": 
null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:31:54 -0400 (0:00:00.158) 0:22:43.078 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:31:54 -0400 (0:00:00.161) 0:22:43.239 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:31:54 -0400 (0:00:00.137) 0:22:43.376 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:31:56 -0400 (0:00:01.420) 0:22:44.797 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:31:57 -0400 (0:00:01.777) 0:22:46.574 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:31:59 -0400 (0:00:01.177) 0:22:47.752 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:31:59 -0400 (0:00:00.315) 0:22:48.067 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:32:00 -0400 (0:00:01.556) 0:22:49.623 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472190.1215708, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776472183.1915498, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 69206214, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 
1776472183.1905499, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2650449719", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:32:01 -0400 (0:00:01.102) 0:22:50.726 ********** changed: [managed-node16] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:32:03 -0400 (0:00:01.405) 0:22:52.131 ********** ok: [managed-node16] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:499 Friday 17 April 2026 20:32:05 -0400 (0:00:01.888) 0:22:54.019 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:32:05 -0400 (0:00:00.529) 0:22:54.549 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": 
null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:32:06 -0400 (0:00:00.304) 0:22:54.853 ********** skipping: [managed-node16] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:32:06 -0400 (0:00:00.344) 0:22:55.197 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "ea8c9c9d-1ed3-42e9-9f63-505a344c8414" }, "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "size": "4G", "type": "crypt", "uuid": "44090019-7afe-4719-8d08-b9422df3fdc3" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "MXMlX8-1TPL-giWv-f9GY-miKG-0a91-sT0DvI" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:32:08 -0400 (0:00:01.655) 0:22:56.852 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002576", "end": "2026-04-17 20:32:09.414434", "rc": 0, "start": "2026-04-17 20:32:09.411858" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:32:09 -0400 (0:00:01.562) 0:22:58.414 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002434", "end": "2026-04-17 20:32:10.885280", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:32:10.882846" } STDOUT: luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:32:11 -0400 (0:00:01.450) 0:22:59.865 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:32:11 -0400 (0:00:00.409) 0:23:00.275 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:32:11 -0400 (0:00:00.248) 0:23:00.524 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023991", "end": "2026-04-17 20:32:13.348125", "rc": 0, "start": "2026-04-17 20:32:13.324134" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:32:13 -0400 (0:00:01.830) 0:23:02.355 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:32:13 -0400 (0:00:00.337) 0:23:02.692 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:32:14 -0400 (0:00:00.665) 0:23:03.357 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:32:15 -0400 (0:00:00.411) 0:23:03.769 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:32:16 -0400 (0:00:01.474) 0:23:05.243 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:32:16 -0400 (0:00:00.261) 0:23:05.504 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:32:16 -0400 (0:00:00.199) 0:23:05.704 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:32:17 -0400 (0:00:00.145) 0:23:05.850 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:32:17 -0400 (0:00:00.163) 0:23:06.014 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:32:17 -0400 (0:00:00.179) 0:23:06.193 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:32:17 -0400 (0:00:00.288) 0:23:06.481 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:32:18 -0400 (0:00:00.356) 0:23:06.838 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.9.63 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:32:19 -0400 (0:00:01.411) 0:23:08.250 ********** skipping: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:32:19 -0400 (0:00:00.259) 0:23:08.510 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:32:20 -0400 (0:00:00.343) 0:23:08.853 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:32:20 -0400 (0:00:00.217) 0:23:09.071 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:32:20 -0400 (0:00:00.204) 0:23:09.275 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:32:20 -0400 (0:00:00.217) 0:23:09.493 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:32:20 -0400 (0:00:00.155) 0:23:09.648 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:32:21 -0400 (0:00:00.196) 0:23:09.844 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:32:21 -0400 (0:00:00.279) 0:23:10.124 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:32:21 -0400 (0:00:00.257) 0:23:10.381 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:32:21 -0400 (0:00:00.176) 0:23:10.557 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:32:22 -0400 (0:00:00.299) 0:23:10.857 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:32:22 -0400 (0:00:00.221) 0:23:11.078 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:32:22 -0400 (0:00:00.127) 0:23:11.206 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:32:22 -0400 (0:00:00.327) 0:23:11.533 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node16 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 April 2026 20:32:23 -0400 (0:00:00.393) 0:23:11.926 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 April 2026 20:32:23 -0400 (0:00:00.292) 0:23:12.218 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 April 2026 20:32:23 -0400 (0:00:00.245) 0:23:12.464 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 April 2026 20:32:24 -0400 (0:00:00.316) 0:23:12.781 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 April 2026 20:32:24 -0400 (0:00:00.215) 0:23:12.996 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 April 2026 20:32:24 -0400 (0:00:00.282) 0:23:13.278 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 April 2026 20:32:24 -0400 (0:00:00.191) 0:23:13.470 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:32:25 -0400 (0:00:00.323) 0:23:13.794 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:32:25 -0400 (0:00:00.365) 0:23:14.159 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node16 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 April 2026 20:32:25 -0400 (0:00:00.458) 0:23:14.617 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 April 2026 20:32:26 -0400 (0:00:00.377) 0:23:14.995 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 April 2026 20:32:26 -0400 (0:00:00.192) 0:23:15.187 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 April 2026 20:32:26 -0400 (0:00:00.181) 0:23:15.368 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:32:26 -0400 (0:00:00.165) 0:23:15.534 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:32:28 -0400 (0:00:01.324) 0:23:16.859 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:32:28 -0400 (0:00:00.285) 0:23:17.144 ********** skipping: [managed-node16] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:32:28 -0400 (0:00:00.347) 0:23:17.492 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node16 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 April 2026 20:32:29 -0400 (0:00:00.443) 0:23:17.936 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 April 2026 20:32:29 -0400 (0:00:00.234) 0:23:18.170 ********** ok: [managed-node16] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 April 2026 20:32:29 -0400 (0:00:00.275) 0:23:18.446 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 April 2026 20:32:29 -0400 (0:00:00.224) 0:23:18.670 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 April 2026 20:32:30 -0400 (0:00:00.297) 0:23:18.968 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 April 2026 20:32:30 -0400 (0:00:00.211) 0:23:19.179 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:32:30 -0400 (0:00:00.224) 0:23:19.404 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:32:30 -0400 (0:00:00.256) 0:23:19.660 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:32:31 -0400 (0:00:00.463) 0:23:20.124 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node16 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 April 2026 20:32:31 -0400 (0:00:00.425) 0:23:20.550 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 April 2026 20:32:32 -0400 (0:00:00.191) 0:23:20.741 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 April 2026 20:32:32 -0400 (0:00:00.176) 0:23:20.918 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 April 2026 20:32:32 -0400 (0:00:00.208) 0:23:21.126 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 April 2026 20:32:32 -0400 (0:00:00.164) 0:23:21.290 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 April 2026 20:32:32 -0400 (0:00:00.163) 0:23:21.454 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 April 2026 20:32:32 -0400 (0:00:00.235) 0:23:21.690 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:32:33 -0400 (0:00:00.214) 0:23:21.905 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:32:33 -0400 (0:00:00.484) 0:23:22.389 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:32:33 -0400 (0:00:00.226) 0:23:22.616 ********** skipping: [managed-node16] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:32:34 -0400 (0:00:00.246) 0:23:22.862 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 
20:32:34 -0400 (0:00:00.213) 0:23:23.076 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:32:34 -0400 (0:00:00.180) 0:23:23.257 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:32:34 -0400 (0:00:00.221) 0:23:23.478 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:32:34 -0400 (0:00:00.118) 0:23:23.597 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:32:34 -0400 (0:00:00.125) 0:23:23.722 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:32:35 -0400 (0:00:00.141) 0:23:23.864 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:32:35 -0400 (0:00:00.381) 0:23:24.245 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:32:35 -0400 (0:00:00.250) 0:23:24.495 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:32:36 -0400 (0:00:00.685) 0:23:25.180 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:32:36 -0400 (0:00:00.174) 0:23:25.355 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:32:36 -0400 (0:00:00.235) 0:23:25.591 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:32:37 -0400 (0:00:00.257) 0:23:25.849 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:32:37 -0400 (0:00:00.121) 0:23:25.970 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:32:37 -0400 (0:00:00.175) 0:23:26.146 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:32:37 -0400 (0:00:00.218) 0:23:26.364 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:32:37 -0400 (0:00:00.244) 0:23:26.609 ********** 
skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:32:38 -0400 (0:00:00.199) 0:23:26.809 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:32:38 -0400 (0:00:00.134) 0:23:26.943 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:32:38 -0400 (0:00:00.317) 0:23:27.261 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:32:38 -0400 (0:00:00.167) 0:23:27.428 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:32:39 -0400 (0:00:00.449) 0:23:27.878 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:32:39 -0400 (0:00:00.236) 0:23:28.115 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:32:39 -0400 (0:00:00.179) 0:23:28.294 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:32:39 -0400 (0:00:00.217) 0:23:28.512 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:32:40 -0400 (0:00:00.285) 0:23:28.797 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:32:40 -0400 (0:00:00.246) 0:23:29.044 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:32:40 -0400 (0:00:00.405) 0:23:29.449 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:32:41 -0400 (0:00:00.381) 0:23:29.830 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472310.5369377, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776472310.5369377, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 300885, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776472310.5369377, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:32:42 -0400 (0:00:01.580) 0:23:31.411 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:32:42 -0400 (0:00:00.294) 0:23:31.705 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:32:43 -0400 (0:00:00.265) 0:23:31.970 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:32:43 -0400 (0:00:00.186) 0:23:32.157 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:32:43 -0400 (0:00:00.085) 0:23:32.243 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:32:43 -0400 (0:00:00.123) 0:23:32.366 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:32:43 -0400 (0:00:00.167) 0:23:32.534 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472310.6769383, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776472310.6769383, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 320582, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776472310.6769383, "nlink": 1, "path": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:32:44 -0400 (0:00:01.043) 0:23:33.578 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:32:48 -0400 (0:00:03.960) 0:23:37.538 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010315", "end": "2026-04-17 20:32:50.140428", "rc": 0, "start": "2026-04-17 20:32:50.130113" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: ea8c9c9d-1ed3-42e9-9f63-505a344c8414 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 932695 Threads: 2 Salt: 75 fc 19 0a e4 b0 64 d9 
ce 2a 71 77 2c e6 b8 14 3f 3d 84 4e bb 6d a2 a6 65 d0 a4 21 b4 7e 07 71 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: f9 65 3c d3 01 fc 8d e9 34 ef 58 82 01 70 6e a8 c5 88 92 33 14 42 d9 e2 e5 dc 24 ea b7 47 0c 14 Digest: c3 d4 58 4d c3 11 15 b0 3c 07 2e 59 1f 1b f5 11 1b c3 c5 a4 96 b0 77 8c de ce ee 02 cd ed 41 ee TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:32:50 -0400 (0:00:01.573) 0:23:39.111 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:32:50 -0400 (0:00:00.140) 0:23:39.251 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:32:50 -0400 (0:00:00.240) 0:23:39.492 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:32:50 -0400 (0:00:00.157) 0:23:39.649 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:32:51 -0400 (0:00:00.219) 0:23:39.869 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:32:51 -0400 (0:00:00.220) 0:23:40.089 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:32:51 -0400 (0:00:00.193) 0:23:40.283 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:32:51 -0400 (0:00:00.096) 0:23:40.380 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } 
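[Editorial aside, not part of the captured output: the crypttab facts set in the task above feed the entry checks that follow. As a rough, hedged sketch only (not the role's actual test source), a check of that shape could be written as an Ansible assert task; the variable names mirror the facts shown in this log, but the task body below is illustrative.]

    # Sketch only: assert that the number of /etc/crypttab entries found for this
    # volume matches the expected count captured in the facts above.
    - name: Check for /etc/crypttab entry (sketch)
      assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
        msg: "Unexpected number of /etc/crypttab entries for this volume"

[A matching check on the key file field could compare the third column of each entry against _storage_test_expected_crypttab_key_file in the same way.]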
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:32:51 -0400 (0:00:00.209) 0:23:40.590 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:32:52 -0400 (0:00:00.234) 0:23:40.824 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:32:52 -0400 (0:00:00.222) 0:23:41.047 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:32:52 -0400 (0:00:00.232) 0:23:41.280 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:32:52 -0400 (0:00:00.205) 0:23:41.485 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:32:52 -0400 (0:00:00.217) 0:23:41.702 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:32:53 -0400 (0:00:00.160) 0:23:41.862 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:32:53 -0400 (0:00:00.158) 0:23:42.021 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:32:53 -0400 (0:00:00.239) 0:23:42.260 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:32:53 -0400 (0:00:00.156) 0:23:42.417 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:32:53 -0400 (0:00:00.179) 0:23:42.597 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:32:54 -0400 (0:00:00.203) 0:23:42.800 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:32:54 -0400 (0:00:00.245) 0:23:43.046 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:32:54 -0400 (0:00:00.167) 0:23:43.214 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:32:54 -0400 (0:00:00.210) 0:23:43.425 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:32:54 -0400 (0:00:00.216) 0:23:43.642 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:32:56 -0400 (0:00:01.575) 0:23:45.217 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:32:58 -0400 (0:00:01.692) 0:23:46.909 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:32:58 -0400 
(0:00:00.317) 0:23:47.226 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:32:58 -0400 (0:00:00.224) 0:23:47.450 ********** ok: [managed-node16] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:33:00 -0400 (0:00:01.504) 0:23:48.955 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:33:00 -0400 (0:00:00.282) 0:23:49.238 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:33:00 -0400 (0:00:00.191) 0:23:49.430 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:33:00 -0400 (0:00:00.235) 0:23:49.666 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:33:01 -0400 (0:00:00.395) 0:23:50.061 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:33:01 -0400 (0:00:00.412) 0:23:50.473 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:33:02 -0400 (0:00:00.285) 0:23:50.759 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:33:02 -0400 (0:00:00.289) 0:23:51.049 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:33:02 -0400 
(0:00:00.207) 0:23:51.256 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:33:02 -0400 (0:00:00.308) 0:23:51.564 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:33:03 -0400 (0:00:00.211) 0:23:51.776 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:33:03 -0400 (0:00:00.238) 0:23:52.015 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:33:03 -0400 (0:00:00.244) 0:23:52.259 ********** skipping: [managed-node16] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:33:03 -0400 (0:00:00.177) 0:23:52.437 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:33:03 -0400 (0:00:00.206) 0:23:52.643 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:33:04 -0400 (0:00:00.223) 0:23:52.867 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:33:04 -0400 (0:00:00.202) 0:23:53.070 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:33:04 -0400 (0:00:00.261) 0:23:53.332 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 
20:33:04 -0400 (0:00:00.147) 0:23:53.480 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:33:05 -0400 (0:00:00.282) 0:23:53.762 ********** ok: [managed-node16] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:33:05 -0400 (0:00:00.233) 0:23:53.995 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:33:05 -0400 (0:00:00.208) 0:23:54.204 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:33:05 -0400 (0:00:00.271) 0:23:54.475 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023967", "end": "2026-04-17 20:33:06.899164", "rc": 0, "start": "2026-04-17 20:33:06.875197" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:33:06 -0400 (0:00:01.245) 0:23:55.721 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:33:07 -0400 (0:00:00.096) 0:23:55.817 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:33:07 -0400 (0:00:00.125) 0:23:55.942 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:33:07 -0400 (0:00:00.052) 0:23:55.994 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] 
************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:33:07 -0400 (0:00:00.046) 0:23:56.040 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:33:07 -0400 (0:00:00.068) 0:23:56.109 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:33:07 -0400 (0:00:00.099) 0:23:56.209 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:33:07 -0400 (0:00:00.182) 0:23:56.391 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:33:07 -0400 (0:00:00.192) 0:23:56.584 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:502 Friday 17 April 2026 20:33:07 -0400 (0:00:00.134) 0:23:56.718 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:33:08 -0400 (0:00:00.351) 0:23:57.070 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:33:08 -0400 (0:00:00.209) 0:23:57.279 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:33:08 -0400 (0:00:00.184) 0:23:57.464 ********** ok: [managed-node16] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:33:10 -0400 (0:00:01.397) 
0:23:58.861 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:33:10 -0400 (0:00:00.113) 0:23:58.975 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:33:13 -0400 (0:00:03.115) 0:24:02.091 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:33:14 -0400 (0:00:00.932) 0:24:03.023 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:33:14 -0400 (0:00:00.143) 0:24:03.166 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:33:14 -0400 (0:00:00.176) 0:24:03.343 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:33:14 -0400 
(0:00:00.141) 0:24:03.485 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:33:14 -0400 (0:00:00.092) 0:24:03.577 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:33:15 -0400 (0:00:00.237) 0:24:03.814 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:33:15 -0400 (0:00:00.141) 0:24:03.956 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:33:15 -0400 (0:00:00.061) 0:24:04.018 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:33:18 -0400 (0:00:03.623) 0:24:07.642 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:33:19 -0400 (0:00:00.258) 0:24:07.901 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:33:19 -0400 (0:00:00.189) 0:24:08.091 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:33:24 -0400 (0:00:05.207) 0:24:13.298 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:33:24 -0400 (0:00:00.258) 0:24:13.557 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are 
present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:33:24 -0400 (0:00:00.099) 0:24:13.657 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:33:25 -0400 (0:00:00.173) 0:24:13.830 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:33:25 -0400 (0:00:00.185) 0:24:14.015 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:33:29 -0400 (0:00:04.022) 0:24:18.038 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { 
"name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": 
"getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", 
"source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { 
"name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" 
}, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": 
{ "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:33:31 -0400 (0:00:02.560) 0:24:20.599 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:33:32 -0400 (0:00:00.262) 0:24:20.861 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=MXMlX8-1TPL-giWv-f9GY-miKG-0a91-sT0DvI", "_raw_device": 
"/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:33:38 -0400 (0:00:06.415) 0:24:27.276 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:33:38 -0400 (0:00:00.206) 0:24:27.483 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472318.779963, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e5b96ce077c2f1a44f76c00ded748f202074b48d", "ctime": 1776472318.776963, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 419430537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776472318.776963, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "571036750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:33:40 -0400 (0:00:01.430) 0:24:28.913 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:33:41 -0400 (0:00:01.321) 0:24:30.235 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:33:41 -0400 (0:00:00.279) 0:24:30.514 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "fs_type": "xfs" }, { "action": "destroy device", "device": 
"/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=MXMlX8-1TPL-giWv-f9GY-miKG-0a91-sT0DvI", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:33:42 -0400 (0:00:00.212) 0:24:30.727 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:33:42 -0400 (0:00:00.282) 0:24:31.009 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=MXMlX8-1TPL-giWv-f9GY-miKG-0a91-sT0DvI", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": 
null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:33:42 -0400 (0:00:00.184) 0:24:31.194 ********** changed: [managed-node16] => (item={'src': '/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:33:43 -0400 (0:00:01.316) 0:24:32.510 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:33:45 -0400 (0:00:01.832) 0:24:34.343 ********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:33:45 -0400 (0:00:00.254) 0:24:34.597 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:33:46 -0400 (0:00:00.326) 0:24:34.924 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:33:48 -0400 (0:00:01.963) 0:24:36.888 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472330.8839998, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5c9cee21e6ba73cd4d6b925757b4b441008c089c", "ctime": 1776472323.1769764, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 241172696, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776472323.1759763, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3618598809", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 
Friday 17 April 2026 20:33:49 -0400 (0:00:01.431) 0:24:38.319 ********** changed: [managed-node16] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ea8c9c9d-1ed3-42e9-9f63-505a344c8414", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:33:51 -0400 (0:00:01.674) 0:24:39.994 ********** ok: [managed-node16] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:511 Friday 17 April 2026 20:33:52 -0400 (0:00:01.615) 0:24:41.609 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:33:53 -0400 (0:00:00.253) 0:24:41.863 ********** skipping: [managed-node16] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:33:53 -0400 (0:00:00.118) 0:24:41.981 ********** ok: [managed-node16] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=MXMlX8-1TPL-giWv-f9GY-miKG-0a91-sT0DvI", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:33:53 -0400 (0:00:00.180) 0:24:42.161 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:33:54 -0400 (0:00:01.111) 0:24:43.273 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002792", "end": "2026-04-17 20:33:55.629083", "rc": 0, "start": "2026-04-17 20:33:55.626291" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:33:55 -0400 (0:00:01.409) 0:24:44.682 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002178", "end": "2026-04-17 20:33:57.308116", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:33:57.305938" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:33:57 -0400 (0:00:01.653) 0:24:46.336 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:33:57 -0400 (0:00:00.213) 0:24:46.549 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:33:58 -0400 (0:00:00.389) 0:24:46.939 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:33:58 -0400 (0:00:00.266) 0:24:47.205 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:33:59 -0400 (0:00:01.023) 0:24:48.229 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:33:59 -0400 (0:00:00.326) 0:24:48.556 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:34:00 -0400 (0:00:00.234) 0:24:48.790 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:34:00 -0400 (0:00:00.185) 0:24:48.976 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:34:00 -0400 (0:00:00.085) 0:24:49.062 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:34:00 -0400 (0:00:00.107) 0:24:49.169 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:34:00 -0400 (0:00:00.172) 0:24:49.342 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:34:00 -0400 (0:00:00.098) 0:24:49.440 ********** skipping: [managed-node16] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:34:00 -0400 (0:00:00.177) 0:24:49.618 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:34:01 -0400 (0:00:00.217) 0:24:49.835 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:34:01 -0400 (0:00:00.164) 0:24:50.000 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:34:01 -0400 (0:00:00.273) 0:24:50.273 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:34:01 -0400 (0:00:00.398) 0:24:50.672 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:34:02 -0400 (0:00:00.226) 0:24:50.898 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:34:02 -0400 (0:00:00.143) 0:24:51.042 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:34:02 -0400 (0:00:00.094) 0:24:51.137 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:34:02 -0400 (0:00:00.255) 0:24:51.392 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:34:02 -0400 (0:00:00.198) 0:24:51.591 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:34:03 -0400 (0:00:00.251) 0:24:51.842 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:34:03 -0400 (0:00:00.180) 0:24:52.023 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776472418.189266, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776472418.189266, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36719, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776472418.189266, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:34:04 -0400 (0:00:01.186) 0:24:53.209 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:34:04 -0400 (0:00:00.269) 0:24:53.479 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:34:04 -0400 (0:00:00.198) 0:24:53.678 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:34:05 -0400 (0:00:00.168) 0:24:53.847 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:34:05 -0400 (0:00:00.218) 0:24:54.065 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:34:05 -0400 (0:00:00.201) 0:24:54.266 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:34:05 -0400 (0:00:00.118) 0:24:54.385 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:34:05 -0400 (0:00:00.138) 0:24:54.523 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:34:09 -0400 (0:00:04.181) 0:24:58.705 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:34:10 -0400 (0:00:00.205) 0:24:58.911 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:34:10 -0400 (0:00:00.167) 0:24:59.078 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:34:10 -0400 (0:00:00.181) 0:24:59.260 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:34:10 -0400 (0:00:00.213) 0:24:59.473 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:34:10 -0400 (0:00:00.179) 0:24:59.653 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:34:11 -0400 (0:00:00.133) 0:24:59.786 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:34:11 -0400 (0:00:00.188) 0:24:59.975 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:34:11 -0400 (0:00:00.186) 0:25:00.162 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:34:11 -0400 (0:00:00.207) 0:25:00.369 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:34:11 -0400 (0:00:00.187) 0:25:00.556 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:34:11 -0400 (0:00:00.161) 0:25:00.718 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:34:12 -0400 (0:00:00.127) 0:25:00.845 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:34:12 -0400 (0:00:00.219) 0:25:01.064 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:34:12 -0400 (0:00:00.194) 0:25:01.259 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:34:12 -0400 (0:00:00.203) 0:25:01.463 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:34:12 -0400 (0:00:00.138) 0:25:01.602 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:34:13 -0400 (0:00:00.289) 0:25:01.891 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:34:13 -0400 (0:00:00.234) 0:25:02.126 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:34:13 -0400 (0:00:00.216) 0:25:02.342 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:34:13 -0400 (0:00:00.183) 0:25:02.525 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:34:13 -0400 (0:00:00.183) 0:25:02.708 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:34:14 -0400 (0:00:00.291) 0:25:03.000 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:34:14 -0400 (0:00:00.106) 0:25:03.106 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:34:14 -0400 (0:00:00.358) 0:25:03.465 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:34:14 -0400 (0:00:00.246) 0:25:03.712 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:34:15 -0400 (0:00:00.192) 0:25:03.905 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:34:15 -0400 (0:00:00.220) 0:25:04.125 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:34:15 -0400 (0:00:00.320) 0:25:04.445 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:34:15 -0400 (0:00:00.216) 0:25:04.662 ********** skipping: [managed-node16] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:34:16 -0400 (0:00:00.255) 0:25:04.918 ********** skipping: [managed-node16] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:34:16 -0400 (0:00:00.236) 0:25:05.154 ********** skipping: [managed-node16] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:34:16 -0400 (0:00:00.299) 0:25:05.453 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:34:17 -0400 (0:00:00.341) 0:25:05.795 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:34:17 -0400 (0:00:00.236) 0:25:06.031 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:34:17 -0400 (0:00:00.195) 0:25:06.226 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:34:17 -0400 (0:00:00.260) 0:25:06.487 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:34:17 -0400 (0:00:00.186) 0:25:06.674 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:34:18 -0400 (0:00:00.149) 0:25:06.823 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:34:18 -0400 (0:00:00.170) 0:25:06.993 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:34:18 -0400 (0:00:00.233) 0:25:07.227 ********** skipping: [managed-node16] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:34:18 -0400 (0:00:00.283) 0:25:07.510 ********** skipping: [managed-node16] => {} TASK [Show test volume 
size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:34:19 -0400 (0:00:00.278) 0:25:07.788 ********** skipping: [managed-node16] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:34:19 -0400 (0:00:00.159) 0:25:07.948 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:34:19 -0400 (0:00:00.225) 0:25:08.173 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:34:19 -0400 (0:00:00.289) 0:25:08.462 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:34:20 -0400 (0:00:00.296) 0:25:08.759 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:34:20 -0400 (0:00:00.284) 0:25:09.043 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:34:20 -0400 (0:00:00.261) 0:25:09.304 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:34:20 -0400 (0:00:00.388) 0:25:09.693 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:34:21 -0400 (0:00:00.290) 0:25:09.983 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:34:21 -0400 
(0:00:00.182) 0:25:10.166 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:34:21 -0400 (0:00:00.174) 0:25:10.340 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:34:21 -0400 (0:00:00.257) 0:25:10.598 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:34:22 -0400 (0:00:00.225) 0:25:10.824 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:34:22 -0400 (0:00:00.335) 0:25:11.159 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:34:22 -0400 (0:00:00.192) 0:25:11.351 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:34:22 -0400 (0:00:00.191) 0:25:11.543 ********** skipping: [managed-node16] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:34:23 -0400 (0:00:00.232) 0:25:11.776 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:34:23 -0400 (0:00:00.161) 0:25:11.937 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node16 : ok=1266 changed=60 unreachable=0 failed=9 skipped=1115 rescued=9 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:10:32.832664+00:00Z", "host": "managed-node16", "message": 
"encrypted volume 'foo' missing key/password", "start_time": "2026-04-18T00:10:27.295130+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:10:33.020580+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:10:32.838709+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:12:43.712683+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52' in safe mode due to encryption removal", "start_time": "2026-04-18T00:12:38.247088+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", 
"task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:12:44.009777+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-8b0f9973-6fca-4d87-b62f-c32a1cc12a52' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:12:43.719611+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:14:39.535116+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-04-18T00:14:35.113566+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": 
"/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:14:39.652437+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:14:39.553816+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:16:42.981323+00:00Z", "host": "managed-node16", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-18T00:16:37.630242+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": 
"2026-04-18T00:16:43.201346+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:16:43.013496+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:19:13.047854+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on 
device 'luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798' in safe mode due to encryption removal", "start_time": "2026-04-18T00:19:07.940574+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:19:13.310406+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing 
formatting on device 'luks-2f0698e3-f8a3-4aa6-9d24-bac03448f798' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:19:13.097873+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:21:40.860866+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2026-04-18T00:21:35.838132+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:21:41.197554+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:21:40.908512+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:24:10.961066+00:00Z", "host": "managed-node16", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-18T00:24:05.831135+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:24:11.259971+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": 
false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:24:11.040805+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:28:45.848152+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'luks-30ec6405-b599-4e78-a6b8-ead3f359c08c' in safe mode due to encryption removal", "start_time": "2026-04-18T00:28:40.189783+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:28:46.092846+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-30ec6405-b599-4e78-a6b8-ead3f359c08c' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:28:45.890661+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:31:12.753423+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-04-18T00:31:07.468700+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:31:12.948329+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:31:12.785968+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Friday 17 April 2026 20:34:23 -0400 (0:00:00.298) 0:25:12.236 ********** =============================================================================== fedora.linux_system_roles.storage : Record storage role fingerprint in syslog -- 36.19s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.47s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.27s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.02s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.47s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.31s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the 
pools and volumes to match the specified state -- 11.75s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 9.18s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.42s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.84s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.67s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.66s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.65s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.55s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.54s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.48s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.44s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Make sure blivet is available ------- 5.42s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 fedora.linux_system_roles.storage : Get required packages --------------- 5.41s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.39s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
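
The error dump above shows two recurring failure classes raised by the blivet provider at main-blivet.yml:88: "encrypted volume ... missing key/password" (an encrypted volume requested without any encryption_password or encryption_key) and "cannot remove existing formatting on device ... in safe mode" (adding or removing LUKS on an already-formatted device while safe_mode is true). The sketch below is a minimal, hypothetical role invocation that would avoid both. It assumes the role's public variable names storage_volumes and storage_safe_mode, which correspond to the internal "volumes" and "safe_mode" module_args visible in the error report; the password value is a throwaway placeholder, not anything taken from the tests.

- hosts: managed-node16
  vars:
    # Assumption: safe mode must be disabled when the desired state changes
    # the encryption of a device that already carries formatting, as the
    # "in safe mode" errors above indicate.
    storage_safe_mode: false
  tasks:
    - name: Create an encrypted xfs volume on sda (illustrative values only)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            # Supplying a password (or an encryption_key) is what avoids the
            # "encrypted volume 'foo' missing key/password" failure.
            encryption_password: "CHANGE-ME-placeholder"
            encryption_luks_version: luks2

Note that in the test run above these failures appear to be deliberate: each one is paired with a "Failed message" task and a rescued block, so the play is verifying that the role refuses to proceed without a key and that the safe-mode guard holds. The sketch only illustrates what a passing invocation of the same parameters would look like.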