ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
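For reference, a run that produces output like the transcript below can typically be reproduced with an invocation along these lines. This is a hedged sketch: the collection path and playbook path are taken from the log itself, while the inventory file name and the -vv verbosity level are assumptions, not recorded in the log.

    # Hypothetical reproduction command; paths are from the log above,
    # inventory name and verbosity flag are assumed.
    ANSIBLE_COLLECTIONS_PATHS=/tmp/collections-yHL \
        ansible-playbook -vv -i inventory \
        /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml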
PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Tuesday 19 August 2025 14:16:12 -0400 (0:00:00.268) 0:00:00.268 ********
ok: [managed-node11]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Tuesday 19 August 2025 14:16:16 -0400 (0:00:03.900) 0:00:04.168 ********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28
Tuesday 19 August 2025 14:16:16 -0400 (0:00:00.135) 0:00:04.304 ********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39
Tuesday 19 August 2025 14:16:16 -0400 (0:00:00.207) 0:00:04.512 ********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43
Tuesday 19 August 2025 14:16:16 -0400 (0:00:00.257) 0:00:04.770 ********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Tuesday 19 August 2025 14:16:16 -0400 (0:00:00.179) 0:00:04.949 ********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59
Tuesday 19 August 2025 14:16:17 -0400 (0:00:00.329) 0:00:05.279 ********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68
Tuesday 19 August 2025 14:16:17 -0400 (0:00:00.509) 0:00:05.789 ********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72
Tuesday 19 August 2025 14:16:18 -0400 (0:00:00.479) 0:00:06.268 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:16:18 -0400 (0:00:00.369) 0:00:06.637 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:16:18 -0400 (0:00:00.344) 0:00:06.982 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:16:19 -0400 (0:00:00.423) 0:00:07.405 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:16:19 -0400 (0:00:00.553) 0:00:07.959 ******** ok: [managed-node11] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:16:21 -0400 (0:00:01.908) 0:00:09.867 ******** ok: [managed-node11] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:16:21 -0400 (0:00:00.192) 0:00:10.060 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of 
volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:16:22 -0400 (0:00:00.148) 0:00:10.208 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:16:22 -0400 (0:00:00.119) 0:00:10.327 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:16:22 -0400 (0:00:00.598) 0:00:10.925 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:16:28 -0400 (0:00:05.594) 0:00:16.520 ******** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:16:28 -0400 (0:00:00.321) 0:00:16.842 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:16:29 -0400 (0:00:00.422) 0:00:17.265 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:16:32 -0400 (0:00:03.267) 0:00:20.533 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:16:33 -0400 (0:00:00.698) 0:00:21.231 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:16:33 -0400 (0:00:00.117) 0:00:21.349 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:16:33 -0400 (0:00:00.207) 0:00:21.557 ******** TASK [fedora.linux_system_roles.storage : Make sure required 
packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:16:33 -0400 (0:00:00.115) 0:00:21.672 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:16:37 -0400 (0:00:04.469) 0:00:26.142 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": 
"active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:16:41 -0400 (0:00:03.878) 0:00:30.020 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:16:42 -0400 (0:00:00.558) 0:00:30.579 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:16:42 -0400 (0:00:00.223) 0:00:30.802 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:16:44 -0400 (0:00:01.536) 0:00:32.339 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:16:44 -0400 (0:00:00.186) 0:00:32.525 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755626960.1106112, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1755626958.2476, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755626958.2476, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", 
"readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:16:45 -0400 (0:00:01.251) 0:00:33.777 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:16:45 -0400 (0:00:00.253) 0:00:34.030 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:16:46 -0400 (0:00:00.137) 0:00:34.168 ******** ok: [managed-node11] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:16:46 -0400 (0:00:00.163) 0:00:34.331 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:16:46 -0400 (0:00:00.224) 0:00:34.556 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:16:46 -0400 (0:00:00.127) 0:00:34.683 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:16:46 -0400 (0:00:00.173) 0:00:34.857 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:16:46 -0400 (0:00:00.190) 0:00:35.047 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:16:47 -0400 (0:00:00.222) 0:00:35.270 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:16:47 -0400 (0:00:00.199) 
0:00:35.470 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:16:47 -0400 (0:00:00.238) 0:00:35.708 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755626059.7976253, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:16:49 -0400 (0:00:01.538) 0:00:37.247 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:16:49 -0400 (0:00:00.099) 0:00:37.347 ******** ok: [managed-node11] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:76 Tuesday 19 August 2025 14:16:51 -0400 (0:00:01.978) 0:00:39.325 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node11 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Tuesday 19 August 2025 14:16:51 -0400 (0:00:00.413) 0:00:39.738 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Tuesday 19 August 2025 14:16:55 -0400 (0:00:04.323) 0:00:44.062 ******** ok: [managed-node11] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" 
LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Tuesday 19 August 2025 14:16:58 -0400 (0:00:02.243) 0:00:46.305 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Tuesday 19 August 2025 14:16:58 -0400 (0:00:00.143) 0:00:46.449 ******** ok: [managed-node11] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Tuesday 19 August 2025 14:16:58 -0400 (0:00:00.208) 0:00:46.657 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Tuesday 19 August 2025 14:16:58 -0400 (0:00:00.241) 0:00:46.898 ******** ok: [managed-node11] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:85 Tuesday 19 August 2025 14:16:59 -0400 (0:00:00.290) 0:00:47.189 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:16:59 -0400 (0:00:00.345) 0:00:47.534 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:16:59 -0400 (0:00:00.264) 0:00:47.799 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:17:00 -0400 (0:00:00.523) 0:00:48.322 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml 
for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:17:00 -0400 (0:00:00.292) 0:00:48.615 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:17:00 -0400 (0:00:00.294) 0:00:48.909 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:17:01 -0400 (0:00:00.587) 0:00:49.496 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:17:01 -0400 (0:00:00.265) 0:00:49.762 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:17:01 -0400 (0:00:00.296) 0:00:50.058 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:17:02 -0400 (0:00:00.143) 0:00:50.202 ******** ok: [managed-node11] => { 
"ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:17:02 -0400 (0:00:00.221) 0:00:50.423 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:17:02 -0400 (0:00:00.532) 0:00:50.955 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:17:07 -0400 (0:00:04.751) 0:00:55.707 ******** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:17:07 -0400 (0:00:00.128) 0:00:55.836 ******** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:17:07 -0400 (0:00:00.131) 0:00:55.967 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:17:13 -0400 (0:00:05.184) 0:01:01.152 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:17:13 -0400 (0:00:00.350) 0:01:01.502 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:17:13 -0400 (0:00:00.150) 0:01:01.653 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:17:13 -0400 (0:00:00.155) 0:01:01.808 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:17:13 -0400 (0:00:00.179) 0:01:01.987 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:17:18 -0400 (0:00:04.317) 0:01:06.305 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:17:20 -0400 (0:00:02.812) 0:01:09.117 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:17:21 -0400 (0:00:00.174) 0:01:09.292 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:17:21 -0400 (0:00:00.091) 0:01:09.383 ******** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:17:25 -0400 (0:00:04.468) 0:01:13.852 ******** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:17:26 -0400 (0:00:00.608) 0:01:14.460 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:17:26 -0400 (0:00:00.189) 0:01:14.650 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:17:26 -0400 (0:00:00.166) 0:01:14.816 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify 
correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 19 August 2025 14:17:27 -0400 (0:00:00.347) 0:01:15.163 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:100 Tuesday 19 August 2025 14:17:27 -0400 (0:00:00.184) 0:01:15.348 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:17:27 -0400 (0:00:00.457) 0:01:15.805 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:17:28 -0400 (0:00:00.362) 0:01:16.168 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:17:28 -0400 (0:00:00.208) 0:01:16.377 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:17:28 -0400 (0:00:00.451) 0:01:16.828 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate 
system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:17:28 -0400 (0:00:00.270) 0:01:17.099 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:17:29 -0400 (0:00:00.197) 0:01:17.296 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:17:29 -0400 (0:00:00.158) 0:01:17.455 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:17:29 -0400 (0:00:00.248) 0:01:17.703 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:17:30 -0400 (0:00:00.574) 0:01:18.278 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:17:34 -0400 (0:00:04.307) 0:01:22.585 ******** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:17:34 -0400 (0:00:00.311) 0:01:22.896 ******** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:17:34 -0400 (0:00:00.224) 0:01:23.121 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:17:39 -0400 (0:00:05.007) 0:01:28.128 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** 
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:17:40 -0400 (0:00:00.343) 0:01:28.471 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:17:40 -0400 (0:00:00.250) 0:01:28.722 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:17:40 -0400 (0:00:00.276) 0:01:28.998 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:17:41 -0400 (0:00:00.232) 0:01:29.231 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:17:45 -0400 (0:00:04.507) 0:01:33.738 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": 
"plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:17:48 -0400 (0:00:02.696) 0:01:36.434 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:17:48 -0400 (0:00:00.323) 0:01:36.758 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:17:49 -0400 (0:00:00.521) 0:01:37.280 ******** changed: [managed-node11] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "password": "-", "state": "present" } ], "leaves": [ 
"/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:18:02 -0400 (0:00:13.168) 0:01:50.448 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:18:02 -0400 (0:00:00.258) 0:01:50.707 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755626960.1106112, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1755626958.2476, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755626958.2476, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:18:03 -0400 (0:00:01.241) 0:01:51.949 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup 
services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:18:06 -0400 (0:00:02.488) 0:01:54.437 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:18:06 -0400 (0:00:00.119) 0:01:54.557 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:18:06 -0400 (0:00:00.256) 0:01:54.814 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:18:06 -0400 (0:00:00.276) 0:01:55.090 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:18:07 -0400 (0:00:00.213) 0:01:55.304 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:18:07 -0400 (0:00:00.112) 0:01:55.416 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:18:11 -0400 (0:00:03.964) 0:01:59.381 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:18:13 -0400 (0:00:02.623) 0:02:02.005 ******** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:18:14 -0400 (0:00:00.310) 0:02:02.316 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:18:15 -0400 (0:00:01.586) 0:02:03.902 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755626059.7976253, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:18:17 -0400 (0:00:01.338) 0:02:05.241 ******** changed: [managed-node11] => (item={'backing_device': '/dev/sda', 'name': 'luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:18:18 -0400 (0:00:01.526) 0:02:06.768 ******** ok: [managed-node11] TASK [Verify role results] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:112 Tuesday 19 August 2025 14:18:20 -0400 (0:00:01.749) 0:02:08.517 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:18:20 -0400 (0:00:00.570) 0:02:09.088 ******** skipping: [managed-node11] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:18:21 -0400 (0:00:00.201) 0:02:09.289 ******** ok: [managed-node11] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:18:21 -0400 (0:00:00.216) 0:02:09.506 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "size": "10G", "type": "crypt", "uuid": "2fa4bac3-0c8e-4bc4-9f4a-4f0149817c96" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "673852e6-53e2-4ad0-a5cc-94ae9b8114d2" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:18:23 -0400 (0:00:02.586) 0:02:12.092 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002487", "end": "2025-08-19 
14:18:26.194564", "rc": 0, "start": "2025-08-19 14:18:26.192077" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:18:26 -0400 (0:00:02.587) 0:02:14.679 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002498", "end": "2025-08-19 14:18:27.746362", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:18:27.743864" } STDOUT: luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:18:28 -0400 (0:00:01.837) 0:02:16.517 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:18:28 -0400 (0:00:00.162) 0:02:16.679 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:18:28 -0400 (0:00:00.321) 0:02:17.001 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:18:29 -0400 (0:00:00.229) 0:02:17.230 ******** included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:18:30 -0400 (0:00:01.255) 0:02:18.486 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:18:30 -0400 (0:00:00.227) 0:02:18.714 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:18:30 -0400 (0:00:00.388) 0:02:19.103 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:18:31 -0400 (0:00:00.297) 0:02:19.400 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:18:31 -0400 (0:00:00.201) 0:02:19.601 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:18:31 -0400 (0:00:00.356) 0:02:19.957 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:18:32 -0400 (0:00:00.302) 0:02:20.260 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:18:32 -0400 (0:00:00.204) 0:02:20.464 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:18:32 -0400 (0:00:00.145) 0:02:20.610 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:18:32 -0400 (0:00:00.302) 0:02:20.913 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:18:33 -0400 (0:00:00.240) 0:02:21.153 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:18:33 -0400 (0:00:00.127) 0:02:21.280 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:18:33 -0400 (0:00:00.393) 0:02:21.674 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:18:33 -0400 (0:00:00.155) 0:02:21.830 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:18:33 -0400 (0:00:00.124) 0:02:21.955 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:18:33 -0400 (0:00:00.081) 0:02:22.037 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:18:34 -0400 (0:00:00.132) 0:02:22.169 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:18:34 -0400 (0:00:00.305) 0:02:22.475 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:18:34 -0400 (0:00:00.188) 0:02:22.663 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:18:34 -0400 (0:00:00.148) 0:02:22.811 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627481.8036852, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755627481.8036852, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37038, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1755627481.8036852, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:18:35 -0400 (0:00:01.033) 0:02:23.845 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:18:35 -0400 (0:00:00.154) 0:02:24.000 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:18:36 -0400 (0:00:00.214) 0:02:24.214 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:18:36 -0400 (0:00:00.157) 0:02:24.372 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:18:36 -0400 (0:00:00.238) 0:02:24.610 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:18:36 -0400 (0:00:00.238) 0:02:24.849 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:18:36 -0400 (0:00:00.245) 0:02:25.094 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627481.952686, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755627481.952686, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 149445, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755627481.952686, "nlink": 1, "path": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:18:38 -0400 (0:00:01.652) 0:02:26.747 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:18:43 -0400 (0:00:04.744) 0:02:31.491 ******** ok: [managed-node11] => { "changed": false, 
"cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011821", "end": "2025-08-19 14:18:44.661499", "rc": 0, "start": "2025-08-19 14:18:44.649678" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 673852e6-53e2-4ad0-a5cc-94ae9b8114d2 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 931965 Threads: 2 Salt: a8 95 54 54 3c 0e b1 e9 b2 8c 29 4f 68 8d 3f 57 ab 28 d1 7d 22 14 80 a8 52 4a 23 7a 33 59 a4 f8 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 1e 41 cc 9f 22 8a bf 09 0f 31 55 d4 4e 3b 74 78 14 dd 07 49 7b db 35 1e f2 8f 34 60 34 2c d5 5d Digest: 2e d5 ec a6 de 5e b8 de a3 27 69 e9 1e 65 8a 02 44 16 e2 27 f3 c8 fd fe eb b8 3f 12 01 be 6b 29 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:18:44 -0400 (0:00:01.580) 0:02:33.072 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:18:45 -0400 (0:00:00.247) 0:02:33.319 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:18:45 -0400 (0:00:00.267) 0:02:33.586 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:18:45 -0400 (0:00:00.306) 0:02:33.893 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:18:46 -0400 (0:00:00.263) 0:02:34.157 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:18:46 -0400 (0:00:00.299) 0:02:34.456 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:18:46 -0400 (0:00:00.256) 
0:02:34.713 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:18:46 -0400 (0:00:00.333) 0:02:35.046 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:18:47 -0400 (0:00:00.269) 0:02:35.316 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:18:47 -0400 (0:00:00.254) 0:02:35.571 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:18:47 -0400 (0:00:00.229) 0:02:35.801 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:18:47 -0400 (0:00:00.223) 0:02:36.024 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:18:48 -0400 (0:00:00.259) 0:02:36.283 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:18:48 -0400 (0:00:00.182) 0:02:36.466 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:18:48 -0400 (0:00:00.292) 0:02:36.759 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:18:48 
-0400 (0:00:00.268) 0:02:37.027 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:18:49 -0400 (0:00:00.190) 0:02:37.218 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:18:49 -0400 (0:00:00.314) 0:02:37.532 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:18:49 -0400 (0:00:00.388) 0:02:37.921 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:18:49 -0400 (0:00:00.216) 0:02:38.138 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:18:50 -0400 (0:00:00.296) 0:02:38.434 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:18:50 -0400 (0:00:00.252) 0:02:38.686 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:18:50 -0400 (0:00:00.285) 0:02:38.971 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:18:51 -0400 (0:00:00.356) 0:02:39.328 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:18:51 -0400 (0:00:00.223) 0:02:39.552 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] 
********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:18:51 -0400 (0:00:00.265) 0:02:39.818 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:18:51 -0400 (0:00:00.285) 0:02:40.103 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:18:52 -0400 (0:00:00.204) 0:02:40.308 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:18:52 -0400 (0:00:00.484) 0:02:40.792 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:18:52 -0400 (0:00:00.200) 0:02:40.993 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:18:53 -0400 (0:00:00.179) 0:02:41.173 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:18:53 -0400 (0:00:00.270) 0:02:41.443 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:18:53 -0400 (0:00:00.252) 0:02:41.696 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:18:53 -0400 (0:00:00.241) 0:02:41.938 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:18:53 -0400 (0:00:00.199) 0:02:42.138 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate 
maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:18:54 -0400 (0:00:00.281) 0:02:42.419 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:18:54 -0400 (0:00:00.344) 0:02:42.764 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:18:54 -0400 (0:00:00.285) 0:02:43.049 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:18:55 -0400 (0:00:00.331) 0:02:43.380 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:18:55 -0400 (0:00:00.263) 0:02:43.644 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:18:55 -0400 (0:00:00.227) 0:02:43.871 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:18:55 -0400 (0:00:00.180) 0:02:44.052 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:18:56 -0400 (0:00:00.208) 0:02:44.261 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:18:56 -0400 (0:00:00.258) 0:02:44.519 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:18:56 -0400 (0:00:00.227) 0:02:44.747 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:18:56 -0400 (0:00:00.336) 0:02:45.083 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:18:57 -0400 (0:00:00.328) 0:02:45.411 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:18:57 -0400 (0:00:00.228) 0:02:45.640 ******** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:18:57 -0400 (0:00:00.357) 0:02:45.998 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:18:58 -0400 (0:00:00.298) 0:02:46.297 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:18:58 -0400 (0:00:00.238) 0:02:46.536 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:18:58 -0400 (0:00:00.350) 0:02:46.886 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:18:59 -0400 (0:00:00.277) 0:02:47.164 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:18:59 -0400 (0:00:00.333) 0:02:47.498 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:18:59 -0400 (0:00:00.314) 0:02:47.812 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:19:00 -0400 (0:00:00.529) 0:02:48.342 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:19:00 -0400 (0:00:00.378) 0:02:48.721 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:19:00 -0400 (0:00:00.159) 0:02:48.881 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:19:00 -0400 (0:00:00.165) 0:02:49.047 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 19 August 2025 14:19:01 -0400 (0:00:00.194) 0:02:49.242 ******** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:118 Tuesday 19 August 2025 14:19:03 -0400 (0:00:02.194) 0:02:51.436 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:19:03 -0400 (0:00:00.395) 0:02:51.832 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:19:04 -0400 (0:00:00.317) 0:02:52.150 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:19:04 -0400 (0:00:00.336) 0:02:52.486 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:19:04 -0400 (0:00:00.399) 0:02:52.886 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:19:04 -0400 (0:00:00.223) 0:02:53.110 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:19:05 -0400 (0:00:00.523) 0:02:53.634 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:19:05 -0400 (0:00:00.296) 0:02:53.930 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:19:06 -0400 (0:00:00.312) 0:02:54.242 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage 
: Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:19:06 -0400 (0:00:00.184) 0:02:54.426 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:19:06 -0400 (0:00:00.235) 0:02:54.661 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:19:07 -0400 (0:00:00.501) 0:02:55.163 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:19:11 -0400 (0:00:04.628) 0:02:59.791 ******** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:19:11 -0400 (0:00:00.239) 0:03:00.030 ******** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:19:12 -0400 (0:00:00.192) 0:03:00.223 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:19:17 -0400 (0:00:05.606) 0:03:05.830 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:19:18 -0400 (0:00:00.425) 0:03:06.256 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:19:18 -0400 (0:00:00.263) 0:03:06.520 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:19:18 -0400 (0:00:00.271) 0:03:06.791 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:19:18 -0400 (0:00:00.197) 0:03:06.989 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:19:23 -0400 (0:00:04.915) 0:03:11.904 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:19:27 -0400 (0:00:03.354) 0:03:15.259 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:19:27 -0400 (0:00:00.617) 0:03:15.876 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:19:27 -0400 (0:00:00.242) 0:03:16.119 ******** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:19:33 -0400 (0:00:05.327) 0:03:21.446 ******** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:19:33 -0400 (0:00:00.229) 0:03:21.676 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:19:33 -0400 (0:00:00.214) 0:03:21.890 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:19:34 -0400 
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:19:33 -0400 (0:00:00.214) 0:03:21.676 ********
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:19:33 -0400 (0:00:00.322) 0:03:21.890 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:19:34 -0400 (0:00:00.322) 0:03:22.212 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed
TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 19 August 2025 14:19:34 -0400 (0:00:00.402) 0:03:22.614 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Stat the file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 19 August 2025 14:19:34 -0400 (0:00:00.224) 0:03:22.838 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627542.942048, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755627542.942048, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1755627542.942048, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2861891019", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Assert file presence] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 19 August 2025 14:19:36 -0400 (0:00:01.573) 0:03:24.412 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed
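The two preceding tasks come from verify-data-preservation.yml and implement a simple stat-then-assert check that the file created earlier survived the failed role run. A minimal sketch of that pattern is below; the task bodies and the register name are assumptions for illustration, not the file's actual contents.

# Hypothetical sketch of a stat-then-assert data-preservation check.
- hosts: managed-node11
  tasks:
    - name: Stat the file
      stat:
        path: /opt/test1/quux
      register: __test_file_stat   # register name is illustrative
    - name: Assert file presence
      assert:
        that:
          - __test_file_stat.stat.exists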
"CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:19:38 -0400 (0:00:00.533) 0:03:26.166 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:19:38 -0400 (0:00:00.158) 0:03:26.324 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:19:38 -0400 (0:00:00.207) 0:03:26.532 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:19:38 -0400 (0:00:00.236) 0:03:26.768 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:19:38 -0400 (0:00:00.186) 0:03:26.954 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:19:39 -0400 (0:00:00.397) 0:03:27.352 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:19:43 -0400 (0:00:04.292) 0:03:31.644 ******** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:19:43 -0400 (0:00:00.199) 0:03:31.843 ******** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:19:43 -0400 (0:00:00.295) 0:03:32.139 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:19:49 -0400 (0:00:05.777) 0:03:37.917 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:19:50 -0400 (0:00:00.311) 0:03:38.228 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:19:50 -0400 (0:00:00.218) 0:03:38.447 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:19:50 -0400 (0:00:00.202) 0:03:38.650 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:19:50 -0400 (0:00:00.151) 0:03:38.801 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:19:55 -0400 (0:00:04.349) 0:03:43.150 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { 
"name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", 
"source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": 
"systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:19:57 -0400 (0:00:02.915) 0:03:46.065 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:19:58 -0400 (0:00:00.329) 0:03:46.395 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:19:58 -0400 (0:00:00.146) 0:03:46.542 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:20:03 -0400 (0:00:05.273) 
0:03:51.816 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:20:03 -0400 (0:00:00.272) 0:03:52.088 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627493.5407548, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e81c1ced817ff1bb1c1496144669b934d7005872", "ctime": 1755627493.5367548, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755627493.5367548, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:20:05 -0400 (0:00:01.589) 0:03:53.678 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:20:07 -0400 (0:00:01.668) 0:03:55.346 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:20:07 -0400 (0:00:00.124) 0:03:55.470 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:20:07 -0400 (0:00:00.226) 0:03:55.697 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:20:07 -0400 (0:00:00.237) 0:03:55.935 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:20:08 -0400 (0:00:00.295) 0:03:56.230 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2" } TASK [fedora.linux_system_roles.storage : Tell systemd 
to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:20:09 -0400 (0:00:01.372) 0:03:57.603 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:20:10 -0400 (0:00:01.533) 0:03:59.137 ******** changed: [managed-node11] => (item={'src': 'UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:20:12 -0400 (0:00:01.567) 0:04:00.704 ******** skipping: [managed-node11] => (item={'src': 'UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:20:12 -0400 (0:00:00.208) 0:04:00.912 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:20:14 -0400 (0:00:01.824) 0:04:02.737 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627507.7448394, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "78f7bbf62aceb29cb607df9309cdc3b6d1eaae0e", "ctime": 1755627498.3467834, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 213909708, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1755627498.3447835, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "524728211", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, 
"xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:20:16 -0400 (0:00:01.581) 0:04:04.318 ******** changed: [managed-node11] => (item={'backing_device': '/dev/sda', 'name': 'luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:20:17 -0400 (0:00:01.578) 0:04:05.897 ******** ok: [managed-node11] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:151 Tuesday 19 August 2025 14:20:19 -0400 (0:00:01.874) 0:04:07.771 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:20:20 -0400 (0:00:00.526) 0:04:08.298 ******** skipping: [managed-node11] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:20:20 -0400 (0:00:00.185) 0:04:08.483 ******** ok: [managed-node11] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:20:20 -0400 (0:00:00.297) 0:04:08.781 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e98c89cf-b21a-4f16-bd62-99809e9d3d70" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:20:22 -0400 (0:00:01.382) 0:04:10.164 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002124", "end": "2025-08-19 14:20:23.014704", "rc": 0, "start": "2025-08-19 14:20:23.012580" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:20:23 -0400 (0:00:01.303) 0:04:11.467 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002266", "end": "2025-08-19 14:20:24.411704", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:20:24.409438" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:20:24 -0400 (0:00:01.349) 0:04:12.816 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:20:24 -0400 (0:00:00.156) 0:04:12.973 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:20:25 -0400 (0:00:00.296) 0:04:13.270 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:20:25 -0400 (0:00:00.199) 0:04:13.469 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:20:26 -0400 (0:00:00.882) 0:04:14.352 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:20:26 -0400 (0:00:00.198) 0:04:14.551 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:20:26 -0400 (0:00:00.314) 0:04:14.866 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:20:27 -0400 (0:00:00.318) 0:04:15.184 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:20:27 -0400 (0:00:00.268) 0:04:15.453 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:20:27 -0400 (0:00:00.308) 0:04:15.761 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:20:27 -0400 (0:00:00.263) 0:04:16.025 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:20:28 -0400 (0:00:00.257) 0:04:16.282 ******** skipping: [managed-node11] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:20:28 -0400 (0:00:00.255) 0:04:16.538 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:20:28 -0400 (0:00:00.258) 0:04:16.796 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:20:28 -0400 (0:00:00.285) 0:04:17.081 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:20:29 -0400 (0:00:00.283) 0:04:17.365 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:20:29 -0400 (0:00:00.366) 0:04:17.732 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:20:29 -0400 (0:00:00.337) 0:04:18.069 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:20:30 -0400 (0:00:00.281) 0:04:18.351 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:20:30 -0400 (0:00:00.315) 0:04:18.666 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:20:31 -0400 (0:00:00.706) 0:04:19.373 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:20:31 -0400 (0:00:00.210) 0:04:19.584 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:20:31 -0400 (0:00:00.366) 0:04:19.950 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:20:32 -0400 (0:00:00.252) 0:04:20.203 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627603.3434055, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755627603.3434055, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37038, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1755627603.3434055, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:20:33 -0400 (0:00:01.463) 0:04:21.667 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:20:33 -0400 (0:00:00.257) 0:04:21.925 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:20:34 -0400 (0:00:00.232) 0:04:22.157 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task 
path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:20:34 -0400 (0:00:00.198) 0:04:22.356 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:20:34 -0400 (0:00:00.182) 0:04:22.538 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:20:34 -0400 (0:00:00.302) 0:04:22.841 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:20:34 -0400 (0:00:00.275) 0:04:23.116 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:20:35 -0400 (0:00:00.234) 0:04:23.351 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:20:39 -0400 (0:00:04.554) 0:04:27.905 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:20:40 -0400 (0:00:00.312) 0:04:28.218 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:20:40 -0400 (0:00:00.156) 0:04:28.374 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:20:40 -0400 (0:00:00.221) 0:04:28.596 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:20:40 -0400 (0:00:00.190) 0:04:28.786 ******** skipping: 
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:20:40 -0400 (0:00:00.227) 0:04:29.014 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:20:41 -0400 (0:00:00.177) 0:04:29.191 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:20:41 -0400 (0:00:00.199) 0:04:29.391 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:20:41 -0400 (0:00:00.102) 0:04:29.494 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:20:41 -0400 (0:00:00.166) 0:04:29.661 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:20:41 -0400 (0:00:00.201) 0:04:29.863 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:20:42 -0400 (0:00:00.303) 0:04:30.166 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:20:42 -0400 (0:00:00.167) 0:04:30.334 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:20:42 -0400 (0:00:00.268) 0:04:30.602 ******** ok: [managed-node11] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:20:42 -0400 (0:00:00.213) 0:04:30.816 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:20:42 -0400 (0:00:00.223) 0:04:31.039 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:20:43 -0400 (0:00:00.251) 0:04:31.291 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:20:43 -0400 (0:00:00.281) 0:04:31.573 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:20:43 -0400 (0:00:00.134) 0:04:31.707 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:20:43 -0400 (0:00:00.232) 0:04:31.939 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:20:43 -0400 (0:00:00.161) 0:04:32.101 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:20:44 -0400 (0:00:00.231) 0:04:32.332 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:20:44 -0400 (0:00:00.372) 0:04:32.705 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:20:44 -0400 (0:00:00.258) 0:04:32.963 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:20:45 -0400 (0:00:00.223) 0:04:33.186 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:20:45 -0400 (0:00:00.281) 0:04:33.467 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:20:45 -0400 (0:00:00.183) 0:04:33.651 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:20:45 -0400 (0:00:00.346) 0:04:33.998 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:20:46 -0400 (0:00:00.263) 0:04:34.261 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:20:46 -0400 (0:00:00.254) 0:04:34.516 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:20:46 -0400 (0:00:00.219) 0:04:34.736 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:20:46 -0400 (0:00:00.170) 0:04:34.907 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:20:46 -0400 (0:00:00.177) 0:04:35.084 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } 
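The size-verification tasks above only compute an expected size when the volume's size is requested as a percentage of its pool. The exact expressions live in test-verify-volume-size.yml and are not shown in this log; a minimal sketch of that kind of calculation, using hypothetical fact names (storage_test_pool_size, storage_test_requested_size) rather than the role's actual variables, might look like:

    - name: Calculate the expected size based on pool size and percentage value (illustrative sketch)
      set_fact:
        # e.g. a pool size of 10737418240 bytes (10 GiB) and a requested size of "60%"
        # would yield an expected size of 6442450944 bytes (6 GiB)
        storage_test_expected_size: "{{ (storage_test_pool_size | int * (storage_test_requested_size | replace('%', '') | int) / 100) | int }}"
      when: storage_test_requested_size is match('^[0-9]+%$')

In this run all of the calculation tasks are skipped and storage_test_expected_size stays undefined (as the "Show expected size" output above reports), which is consistent with a whole-disk volume that has no percentage-based size request.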
TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:20:47 -0400 (0:00:00.252) 0:04:35.336 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:20:47 -0400 (0:00:00.200) 0:04:35.537 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:20:47 -0400 (0:00:00.261) 0:04:35.798 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:20:47 -0400 (0:00:00.207) 0:04:36.006 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:20:47 -0400 (0:00:00.132) 0:04:36.138 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:20:48 -0400 (0:00:00.235) 0:04:36.374 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:20:48 -0400 (0:00:00.154) 0:04:36.529 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:20:48 -0400 (0:00:00.195) 0:04:36.725 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:20:48 -0400 (0:00:00.240) 0:04:36.966 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:20:48 -0400 (0:00:00.176) 0:04:37.143 ******** skipping: [managed-node11] => 
{} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:20:49 -0400 (0:00:00.303) 0:04:37.447 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:20:49 -0400 (0:00:00.254) 0:04:37.701 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:20:49 -0400 (0:00:00.355) 0:04:38.057 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:20:50 -0400 (0:00:00.222) 0:04:38.279 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:20:50 -0400 (0:00:00.344) 0:04:38.624 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:20:50 -0400 (0:00:00.296) 0:04:38.920 ******** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:20:51 -0400 (0:00:00.303) 0:04:39.224 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:20:51 -0400 (0:00:00.274) 0:04:39.498 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:20:51 -0400 (0:00:00.069) 0:04:39.568 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:20:51 -0400 (0:00:00.099) 0:04:39.667 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:20:51 -0400 (0:00:00.302) 0:04:39.969 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:20:51 -0400 (0:00:00.098) 0:04:40.068 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:20:52 -0400 (0:00:00.163) 0:04:40.231 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:20:52 -0400 (0:00:00.194) 0:04:40.426 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:20:52 -0400 (0:00:00.176) 0:04:40.603 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:20:52 -0400 (0:00:00.225) 0:04:40.829 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:20:52 -0400 (0:00:00.219) 0:04:41.048 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 19 August 2025 14:20:53 -0400 (0:00:00.210) 0:04:41.259 ******** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task 
path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:157 Tuesday 19 August 2025 14:20:54 -0400 (0:00:01.364) 0:04:42.624 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:20:54 -0400 (0:00:00.323) 0:04:42.947 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:20:55 -0400 (0:00:00.226) 0:04:43.173 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:20:55 -0400 (0:00:00.264) 0:04:43.438 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:20:55 -0400 (0:00:00.329) 0:04:43.768 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:20:55 -0400 (0:00:00.231) 0:04:43.999 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage 
: Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:20:56 -0400 (0:00:00.487) 0:04:44.487 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:20:56 -0400 (0:00:00.260) 0:04:44.747 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:20:56 -0400 (0:00:00.318) 0:04:45.065 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:20:57 -0400 (0:00:00.193) 0:04:45.259 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:20:57 -0400 (0:00:00.228) 0:04:45.487 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:20:57 -0400 (0:00:00.569) 0:04:46.057 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:21:02 -0400 (0:00:04.210) 0:04:50.268 ******** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:21:02 -0400 (0:00:00.139) 0:04:50.407 ******** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:21:02 -0400 (0:00:00.123) 0:04:50.531 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:21:07 -0400 (0:00:04.714) 0:04:55.245 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:21:07 -0400 (0:00:00.247) 0:04:55.493 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:21:07 -0400 (0:00:00.106) 0:04:55.600 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:21:07 -0400 (0:00:00.094) 0:04:55.695 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:21:07 -0400 (0:00:00.140) 0:04:55.835 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:21:11 -0400 (0:00:03.869) 0:04:59.705 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service": { "name": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service": { "name": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:21:14 -0400 (0:00:02.668) 0:05:02.373 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service" ] }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:21:14 -0400 (0:00:00.173) 0:05:02.546 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d673852e6\x2d53e2\x2d4ad0\x2da5cc\x2d94ae9b8114d2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "name": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-sda.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:20:14 EDT", "StateChangeTimestampMonotonic": "1931232378", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d53e2\x2d4ad0\x2da5cc\x2d94ae9b8114d2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "name": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", 
"SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:21:17 -0400 (0:00:02.766) 0:05:05.313 ******** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:21:22 -0400 (0:00:05.230) 0:05:10.543 ******** fatal: [managed-node11]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:21:22 -0400 (0:00:00.123) 0:05:10.667 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d673852e6\x2d53e2\x2d4ad0\x2da5cc\x2d94ae9b8114d2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "name": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d673852e6\\x2d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": 
"0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d53e2\x2d4ad0\x2da5cc\x2d94ae9b8114d2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "name": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", 
"LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d53e2\\x2d4ad0\\x2da5cc\\x2d94ae9b8114d2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:21:25 -0400 (0:00:02.481) 0:05:13.148 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:21:25 -0400 (0:00:00.238) 0:05:13.387 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 19 August 2025 14:21:25 -0400 (0:00:00.335) 0:05:13.722 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 19 August 2025 14:21:25 -0400 (0:00:00.246) 0:05:13.968 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627654.2147064, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755627654.2147064, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1755627654.2147064, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2116410301", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 19 August 2025 14:21:27 -0400 (0:00:01.291) 0:05:15.260 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:177 Tuesday 19 August 2025 14:21:27 -0400 (0:00:00.262) 0:05:15.522 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:21:27 -0400 (0:00:00.558) 0:05:16.080 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:21:28 -0400 (0:00:00.459) 0:05:16.540 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version 
specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:21:28 -0400 (0:00:00.215) 0:05:16.755 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:21:29 -0400 (0:00:00.506) 0:05:17.261 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:21:29 -0400 (0:00:00.272) 0:05:17.534 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:21:29 -0400 (0:00:00.304) 0:05:17.839 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:21:29 -0400 (0:00:00.139) 0:05:17.978 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:21:30 -0400 (0:00:00.192) 0:05:18.171 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for 
managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:21:31 -0400 (0:00:01.076) 0:05:19.248 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:21:35 -0400 (0:00:04.490) 0:05:23.738 ******** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:21:35 -0400 (0:00:00.198) 0:05:23.937 ******** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:21:35 -0400 (0:00:00.191) 0:05:24.129 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:21:40 -0400 (0:00:04.870) 0:05:28.999 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:21:41 -0400 (0:00:00.412) 0:05:29.411 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:21:41 -0400 (0:00:00.239) 0:05:29.651 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:21:41 -0400 (0:00:00.203) 0:05:29.854 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:21:41 -0400 (0:00:00.146) 0:05:30.001 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:21:46 -0400 
(0:00:04.596) 0:05:34.597 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": 
"dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:21:49 -0400 (0:00:03.089) 0:05:37.686 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:21:49 -0400 (0:00:00.422) 0:05:38.108 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:21:50 -0400 (0:00:00.202) 0:05:38.311 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:22:03 -0400 (0:00:13.468) 0:05:51.779 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:22:03 -0400 (0:00:00.201) 0:05:51.981 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627612.2464583, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a953e8cb0b7ddaa1b565e8823fff47639ed7b7eb", "ctime": 1755627612.2434583, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755627612.2434583, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:22:05 -0400 (0:00:01.638) 0:05:53.619 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:22:06 -0400 (0:00:01.496) 0:05:55.116 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:22:07 -0400 (0:00:00.242) 0:05:55.358 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95" ], "mounts": [ 
{ "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:22:07 -0400 (0:00:00.322) 0:05:55.681 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:22:07 -0400 (0:00:00.283) 0:05:55.965 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] 
************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:22:08 -0400 (0:00:00.313) 0:05:56.278 ******** changed: [managed-node11] => (item={'src': 'UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e98c89cf-b21a-4f16-bd62-99809e9d3d70" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:22:09 -0400 (0:00:01.700) 0:05:57.979 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:22:11 -0400 (0:00:01.806) 0:05:59.785 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:22:13 -0400 (0:00:01.544) 0:06:01.330 ******** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:22:13 -0400 (0:00:00.388) 0:06:01.718 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:22:15 -0400 (0:00:01.825) 0:06:03.543 ******** ok: [managed-node11] 
=> { "changed": false, "stat": { "atime": 1755627624.4105303, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755627617.2284877, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 377487532, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1755627617.2264879, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2983693730", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:22:16 -0400 (0:00:01.377) 0:06:04.921 ******** changed: [managed-node11] => (item={'backing_device': '/dev/sda', 'name': 'luks-f43d64bf-4d0c-4327-9e73-764d206c9f95', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:22:18 -0400 (0:00:01.253) 0:06:06.174 ******** ok: [managed-node11] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:190 Tuesday 19 August 2025 14:22:19 -0400 (0:00:01.753) 0:06:07.928 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:22:20 -0400 (0:00:00.555) 0:06:08.483 ******** skipping: [managed-node11] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:22:20 -0400 (0:00:00.159) 0:06:08.642 ******** ok: [managed-node11] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:22:20 -0400 (0:00:00.325) 0:06:08.968 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "size": "10G", "type": "crypt", "uuid": "2d7a049d-58bb-4785-ab50-9f934e37fd51" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f43d64bf-4d0c-4327-9e73-764d206c9f95" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:22:22 -0400 (0:00:01.624) 0:06:10.593 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002842", "end": "2025-08-19 14:22:23.609318", "rc": 0, "start": "2025-08-19 14:22:23.606476" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:22:23 -0400 (0:00:01.423) 0:06:12.016 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002248", "end": "2025-08-19 14:22:24.923471", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:22:24.921223" } STDOUT: luks-f43d64bf-4d0c-4327-9e73-764d206c9f95 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:22:25 -0400 (0:00:01.369) 0:06:13.386 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:22:25 -0400 (0:00:00.232) 0:06:13.618 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:22:25 -0400 (0:00:00.445) 0:06:14.064 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:22:26 -0400 (0:00:00.340) 0:06:14.404 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml 
for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:22:27 -0400 (0:00:01.214) 0:06:15.619 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:22:27 -0400 (0:00:00.291) 0:06:15.911 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:22:27 -0400 (0:00:00.225) 0:06:16.136 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:22:28 -0400 (0:00:00.283) 0:06:16.419 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:22:28 -0400 (0:00:00.279) 0:06:16.699 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:22:28 -0400 (0:00:00.313) 0:06:17.013 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:22:29 -0400 (0:00:00.332) 0:06:17.345 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:22:29 -0400 
(0:00:00.323) 0:06:17.669 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:22:29 -0400 (0:00:00.283) 0:06:17.952 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:22:30 -0400 (0:00:00.284) 0:06:18.237 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:22:30 -0400 (0:00:00.245) 0:06:18.483 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:22:30 -0400 (0:00:00.193) 0:06:18.676 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:22:31 -0400 (0:00:00.671) 0:06:19.348 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:22:31 -0400 (0:00:00.235) 0:06:19.584 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:22:31 -0400 (0:00:00.193) 0:06:19.777 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:22:31 -0400 (0:00:00.211) 0:06:19.989 ******** ok: [managed-node11] => { "changed": false 
} MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:22:32 -0400 (0:00:00.229) 0:06:20.218 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:22:32 -0400 (0:00:00.249) 0:06:20.467 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:22:32 -0400 (0:00:00.246) 0:06:20.714 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:22:32 -0400 (0:00:00.298) 0:06:21.012 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627723.2361114, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755627723.2361114, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37038, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1755627723.2361114, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:22:33 -0400 (0:00:01.125) 0:06:22.138 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:22:34 -0400 (0:00:00.198) 0:06:22.337 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:22:34 -0400 (0:00:00.146) 0:06:22.483 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set 
initial value) (1/2)] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:22:34 -0400 (0:00:00.205) 0:06:22.689 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:22:34 -0400 (0:00:00.193) 0:06:22.882 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:22:34 -0400 (0:00:00.199) 0:06:23.082 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:22:35 -0400 (0:00:00.230) 0:06:23.313 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627723.3661122, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755627723.3661122, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 176671, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755627723.3661122, "nlink": 1, "path": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:22:36 -0400 (0:00:01.502) 0:06:24.815 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:22:40 -0400 (0:00:04.280) 0:06:29.096 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.009774", "end": "2025-08-19 14:22:42.094546", "rc": 0, "start": "2025-08-19 14:22:42.084772" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: f43d64bf-4d0c-4327-9e73-764d206c9f95 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 938371 Threads: 2 Salt: 49 0a 04 74 13 
54 be 49 bb d1 4a f7 39 35 17 42 8c 37 78 34 0d 61 46 73 6a b2 9e f5 e7 d5 2a 80 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120692 Salt: 0e 7c bd 6e 59 68 96 79 88 5e 79 07 0f 12 fc 7f 17 34 85 4a 96 8c 18 ee 48 0e 72 9e d9 f0 9f 4b Digest: 4c 1a 79 e8 ae 6d a0 1f f5 1c 25 95 2b 5c 10 9c 8a a3 5f 53 fc 81 bd 63 d6 6e 42 ee 7d 96 42 96 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:22:42 -0400 (0:00:01.408) 0:06:30.505 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:22:42 -0400 (0:00:00.351) 0:06:30.856 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:22:43 -0400 (0:00:00.314) 0:06:31.170 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:22:43 -0400 (0:00:00.285) 0:06:31.456 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:22:43 -0400 (0:00:00.203) 0:06:31.660 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:22:43 -0400 (0:00:00.240) 0:06:31.900 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:22:43 -0400 (0:00:00.216) 0:06:32.117 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:22:44 -0400 (0:00:00.167) 0:06:32.285 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f43d64bf-4d0c-4327-9e73-764d206c9f95 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK 
[Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:22:44 -0400 (0:00:00.306) 0:06:32.591 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:22:44 -0400 (0:00:00.168) 0:06:32.760 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:22:44 -0400 (0:00:00.196) 0:06:32.957 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:22:45 -0400 (0:00:00.238) 0:06:33.195 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:22:45 -0400 (0:00:00.180) 0:06:33.376 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:22:45 -0400 (0:00:00.134) 0:06:33.510 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:22:45 -0400 (0:00:00.120) 0:06:33.630 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:22:45 -0400 (0:00:00.119) 0:06:33.750 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:22:45 -0400 (0:00:00.184) 0:06:33.934 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:22:45 -0400 (0:00:00.173) 0:06:34.108 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:22:46 -0400 (0:00:00.074) 0:06:34.183 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:22:46 -0400 (0:00:00.115) 0:06:34.299 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:22:46 -0400 (0:00:00.087) 0:06:34.387 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:22:46 -0400 (0:00:00.158) 0:06:34.545 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:22:46 -0400 (0:00:00.079) 0:06:34.624 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:22:46 -0400 (0:00:00.088) 0:06:34.712 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:22:46 -0400 (0:00:00.204) 0:06:34.916 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:22:47 -0400 (0:00:00.261) 0:06:35.177 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:22:47 -0400 (0:00:00.118) 0:06:35.295 ******** ok: 
[managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:22:47 -0400 (0:00:00.166) 0:06:35.462 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:22:47 -0400 (0:00:00.129) 0:06:35.592 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:22:47 -0400 (0:00:00.069) 0:06:35.661 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:22:47 -0400 (0:00:00.135) 0:06:35.797 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:22:47 -0400 (0:00:00.172) 0:06:35.969 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:22:47 -0400 (0:00:00.119) 0:06:36.089 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:22:48 -0400 (0:00:00.097) 0:06:36.186 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:22:48 -0400 (0:00:00.149) 0:06:36.336 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:22:48 -0400 (0:00:00.185) 0:06:36.522 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:22:48 -0400 
(0:00:00.129) 0:06:36.652 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:22:48 -0400 (0:00:00.200) 0:06:36.853 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:22:48 -0400 (0:00:00.266) 0:06:37.119 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:22:49 -0400 (0:00:00.243) 0:06:37.363 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:22:49 -0400 (0:00:00.158) 0:06:37.521 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:22:49 -0400 (0:00:00.185) 0:06:37.707 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:22:49 -0400 (0:00:00.131) 0:06:37.838 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:22:49 -0400 (0:00:00.165) 0:06:38.004 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:22:50 -0400 (0:00:00.152) 0:06:38.156 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:22:50 -0400 (0:00:00.166) 0:06:38.322 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 
14:22:50 -0400 (0:00:00.261) 0:06:38.584 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:22:50 -0400 (0:00:00.191) 0:06:38.776 ******** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:22:50 -0400 (0:00:00.240) 0:06:39.016 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:22:51 -0400 (0:00:00.169) 0:06:39.185 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:22:51 -0400 (0:00:00.148) 0:06:39.334 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:22:51 -0400 (0:00:00.141) 0:06:39.475 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:22:51 -0400 (0:00:00.208) 0:06:39.684 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:22:51 -0400 (0:00:00.090) 0:06:39.774 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:22:51 -0400 (0:00:00.213) 0:06:39.987 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:22:51 -0400 (0:00:00.148) 0:06:40.135 ******** skipping: [managed-node11] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:22:52 -0400 (0:00:00.169) 0:06:40.305 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:22:52 -0400 (0:00:00.198) 0:06:40.504 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:22:52 -0400 (0:00:00.234) 0:06:40.738 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:197 Tuesday 19 August 2025 14:22:52 -0400 (0:00:00.372) 0:06:41.111 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:22:53 -0400 (0:00:00.368) 0:06:41.479 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:22:53 -0400 (0:00:00.210) 0:06:41.689 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:22:53 -0400 (0:00:00.331) 0:06:42.021 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:22:54 -0400 (0:00:00.322) 0:06:42.343 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:22:54 -0400 (0:00:00.416) 0:06:42.760 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: 
[managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:22:55 -0400 (0:00:00.573) 0:06:43.334 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:22:55 -0400 (0:00:00.179) 0:06:43.514 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:22:55 -0400 (0:00:00.228) 0:06:43.742 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:22:55 -0400 (0:00:00.208) 0:06:43.951 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:22:55 -0400 (0:00:00.191) 0:06:44.143 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:22:56 -0400 (0:00:00.681) 0:06:44.824 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK 
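
In the platform/version variable lookup above, RedHat.yml and CentOS.yml are skipped and vars/CentOS_8.yml is the file actually included (the loop reports it twice), supplying the blivet package list. The sketch below is reconstructed from the ansible_facts shown above; the real vars file may define more than this, and the last entry is resolved per architecture when the packages are installed:

    # Reconstructed from the ansible_facts above; not a copy of vars/CentOS_8.yml.
    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - vdo
      - kmod-kvdo
      - xfsprogs
      - stratisd
      - stratis-cli
      - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
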
[fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:23:00 -0400 (0:00:03.857) 0:06:48.681 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:23:00 -0400 (0:00:00.170) 0:06:48.852 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:23:00 -0400 (0:00:00.163) 0:06:49.015 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:23:05 -0400 (0:00:04.649) 0:06:53.665 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:23:06 -0400 (0:00:00.569) 0:06:54.234 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:23:06 -0400 (0:00:00.183) 0:06:54.417 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:23:06 -0400 (0:00:00.288) 0:06:54.705 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:23:06 -0400 (0:00:00.155) 0:06:54.861 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:23:11 -0400 (0:00:04.415) 0:06:59.276 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { 
"name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": 
"nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { 
"name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", 
"status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:23:13 -0400 (0:00:02.827) 0:07:02.104 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:23:14 -0400 (0:00:00.333) 0:07:02.437 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:23:14 -0400 (0:00:00.139) 0:07:02.576 ******** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:23:19 -0400 (0:00:05.279) 0:07:07.856 ******** fatal: [managed-node11]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 
'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:23:19 -0400 (0:00:00.154) 0:07:08.010 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:23:20 -0400 (0:00:00.214) 0:07:08.225 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:23:20 -0400 (0:00:00.242) 0:07:08.468 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 19 August 2025 14:23:21 -0400 (0:00:00.816) 0:07:09.284 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:216 Tuesday 19 August 2025 14:23:21 -0400 (0:00:00.207) 0:07:09.491 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:23:22 -0400 (0:00:00.854) 0:07:10.346 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:23:22 -0400 (0:00:00.302) 
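
The "Test for correct handling of new encrypted volume w/ no key - 2" block above is a negative test: the role is run with encryption enabled but no key, the blivet module fails with "encrypted volume 'test1' missing key/password", and verify-role-failed.yml then asserts that the failure and its message are the expected ones. verify-role-failed.yml itself is not reproduced in this log, so the following is only a sketch of one way such an expected-failure check can be written (task names and the exact assertion are assumptions):

    # Sketch only: the real verify-role-failed.yml is not shown in this log.
    - name: Verify role raises correct error
      block:
        - name: Run the role with parameters expected to fail
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: false
            storage_pools:
              - name: foo
                type: partition
                disks: [sda]
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    mount_point: /opt/test1
                    encryption: true   # no encryption_password -> failure expected
        - name: Unexpected success
          fail:
            msg: the role was expected to fail but did not
      rescue:
        - name: Check that we failed in the role
          assert:
            that:
              - ansible_failed_result.msg is search("missing key/password")

If the role unexpectedly succeeds, the explicit fail task still routes execution into the rescue section, where the assertion on the error message then fails the test.
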
0:07:10.649 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:23:22 -0400 (0:00:00.346) 0:07:10.995 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:23:23 -0400 (0:00:00.530) 0:07:11.525 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:23:23 -0400 (0:00:00.245) 0:07:11.771 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:23:23 -0400 (0:00:00.232) 0:07:12.004 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:23:24 -0400 (0:00:00.189) 0:07:12.193 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 
August 2025 14:23:24 -0400 (0:00:00.172) 0:07:12.366 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:23:24 -0400 (0:00:00.450) 0:07:12.816 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:23:29 -0400 (0:00:04.715) 0:07:17.532 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:23:29 -0400 (0:00:00.254) 0:07:17.786 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:23:29 -0400 (0:00:00.252) 0:07:18.038 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:23:35 -0400 (0:00:05.158) 0:07:23.197 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:23:35 -0400 (0:00:00.488) 0:07:23.685 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:23:35 -0400 (0:00:00.270) 0:07:23.956 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:23:36 -0400 (0:00:00.288) 0:07:24.244 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:23:36 -0400 (0:00:00.147) 0:07:24.392 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } 
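
Compared with the failing run, the "Show storage_pools" output above now carries encryption_password for the test1 volume, which is exactly what the earlier "missing key/password" failure was about. A sketch of the pool definition as it appears in that output follows; the surrounding play scaffolding in tests_luks.yml is assumed, not quoted from this log:

    # Reconstructed from the "Show storage_pools" debug output above.
    - hosts: managed-node11
      roles:
        - fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                # Omitting the passphrase below is what produced
                # "encrypted volume 'test1' missing key/password" earlier.
                encryption_password: yabbadabbadoo
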
MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:23:41 -0400 (0:00:05.024) 0:07:29.417 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": 
"dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:23:43 -0400 (0:00:02.700) 0:07:32.118 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:23:44 -0400 (0:00:00.420) 0:07:32.538 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:23:44 -0400 (0:00:00.290) 0:07:32.828 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "state": "mounted" } ], 
"packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:23:58 -0400 (0:00:14.023) 0:07:46.852 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:23:58 -0400 (0:00:00.269) 0:07:47.121 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627732.8561678, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f8ac96f316832176986e794756fccc2e14c32c48", "ctime": 1755627732.8541677, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755627732.8541677, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:24:00 -0400 
(0:00:01.632) 0:07:48.753 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:24:02 -0400 (0:00:01.749) 0:07:50.503 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:24:02 -0400 (0:00:00.223) 0:07:50.726 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:24:02 -0400 (0:00:00.324) 0:07:51.051 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:24:03 -0400 (0:00:00.356) 0:07:51.407 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:24:03 -0400 (0:00:00.286) 0:07:51.693 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": 
"xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f43d64bf-4d0c-4327-9e73-764d206c9f95" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:24:05 -0400 (0:00:02.066) 0:07:53.759 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:24:07 -0400 (0:00:01.907) 0:07:55.667 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:24:09 -0400 (0:00:01.522) 0:07:57.189 ******** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:24:09 -0400 (0:00:00.286) 0:07:57.475 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:24:10 -0400 (0:00:01.564) 0:07:59.040 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627744.9222386, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "27bbc211137d6c83d364af9bfc69a9dd16903304", "ctime": 1755627737.7081962, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 505413767, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": 
false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1755627737.7061963, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "888062680", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:24:12 -0400 (0:00:01.570) 0:08:00.611 ******** changed: [managed-node11] => (item={'backing_device': '/dev/sda', 'name': 'luks-f43d64bf-4d0c-4327-9e73-764d206c9f95', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node11] => (item={'backing_device': '/dev/sda1', 'name': 'luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:24:15 -0400 (0:00:02.842) 0:08:03.453 ******** ok: [managed-node11] TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:233 Tuesday 19 August 2025 14:24:17 -0400 (0:00:02.128) 0:08:05.582 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:24:18 -0400 (0:00:00.710) 0:08:06.293 ******** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:24:18 -0400 (0:00:00.260) 0:08:06.553 ******** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:24:18 -0400 (0:00:00.280) 0:08:06.833 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "size": "4G", "type": "crypt", "uuid": "852491f4-1006-495d-badd-ed418ab67544" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "7d1089ca-0f6b-48fe-9a70-e7c81f42081f" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:24:20 -0400 (0:00:01.552) 0:08:08.386 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002150", "end": 
"2025-08-19 14:24:21.353310", "rc": 0, "start": "2025-08-19 14:24:21.351160" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:24:21 -0400 (0:00:01.484) 0:08:09.871 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002128", "end": "2025-08-19 14:24:23.230959", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:24:23.228831" } STDOUT: luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:24:23 -0400 (0:00:01.797) 0:08:11.668 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 19 August 2025 14:24:23 -0400 (0:00:00.452) 0:08:12.120 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 19 August 2025 14:24:24 -0400 (0:00:00.254) 0:08:12.375 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 19 August 2025 14:24:24 -0400 (0:00:00.305) 0:08:12.681 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 19 August 2025 14:24:24 -0400 (0:00:00.161) 0:08:12.842 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 19 August 2025 14:24:25 -0400 (0:00:00.580) 0:08:13.423 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 19 August 2025 14:24:25 -0400 (0:00:00.202) 0:08:13.625 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 19 August 2025 14:24:25 -0400 (0:00:00.216) 0:08:13.841 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 19 August 2025 14:24:25 -0400 (0:00:00.208) 0:08:14.050 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 19 August 2025 14:24:26 -0400 (0:00:00.237) 0:08:14.287 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 19 August 2025 14:24:26 -0400 (0:00:00.191) 0:08:14.479 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 19 August 2025 14:24:26 -0400 (0:00:00.182) 0:08:14.662 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 19 August 2025 14:24:26 -0400 (0:00:00.202) 0:08:14.865 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Tuesday 19 August 2025 14:24:26 -0400 (0:00:00.171) 0:08:15.037 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Tuesday 19 August 2025 14:24:27 -0400 (0:00:00.213) 0:08:15.250 ******** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.43.133 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Tuesday 19 August 2025 14:24:28 -0400 (0:00:01.041) 0:08:16.291 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Tuesday 19 August 2025 14:24:28 -0400 (0:00:00.121) 0:08:16.413 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 19 August 2025 14:24:28 -0400 (0:00:00.281) 0:08:16.694 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 19 August 2025 14:24:28 -0400 (0:00:00.101) 0:08:16.796 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 19 August 2025 14:24:28 -0400 (0:00:00.190) 0:08:16.987 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 19 August 2025 14:24:29 -0400 (0:00:00.277) 0:08:17.264 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 19 August 2025 14:24:29 -0400 (0:00:00.207) 0:08:17.472 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 19 August 2025 
14:24:29 -0400 (0:00:00.263) 0:08:17.736 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 19 August 2025 14:24:29 -0400 (0:00:00.212) 0:08:17.948 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 19 August 2025 14:24:29 -0400 (0:00:00.153) 0:08:18.102 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 19 August 2025 14:24:30 -0400 (0:00:00.157) 0:08:18.260 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 19 August 2025 14:24:30 -0400 (0:00:00.186) 0:08:18.446 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 19 August 2025 14:24:30 -0400 (0:00:00.142) 0:08:18.588 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Tuesday 19 August 2025 14:24:30 -0400 (0:00:00.198) 0:08:18.786 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 19 August 2025 14:24:31 -0400 (0:00:00.609) 0:08:19.396 ******** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': 
False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Tuesday 19 August 2025 14:24:31 -0400 (0:00:00.298) 0:08:19.694 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 19 August 2025 14:24:32 -0400 (0:00:00.920) 0:08:20.615 ******** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': 
'/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Tuesday 19 August 2025 14:24:32 -0400 (0:00:00.303) 0:08:20.919 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 19 August 2025 14:24:33 -0400 (0:00:00.574) 0:08:21.493 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 19 August 2025 14:24:33 -0400 (0:00:00.275) 0:08:21.768 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 19 August 2025 14:24:33 -0400 (0:00:00.294) 0:08:22.062 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 19 August 2025 14:24:34 -0400 (0:00:00.297) 0:08:22.360 ******** ok: [managed-node11] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Tuesday 19 August 2025 14:24:34 -0400 (0:00:00.200) 0:08:22.561 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 19 August 2025 14:24:35 -0400 (0:00:00.610) 0:08:23.172 ******** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Tuesday 19 August 2025 14:24:35 -0400 (0:00:00.277) 0:08:23.449 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 19 August 2025 14:24:35 -0400 (0:00:00.625) 0:08:24.075 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 19 August 2025 14:24:36 -0400 (0:00:00.230) 0:08:24.370 ******** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 19 August 2025 14:24:36 -0400 (0:00:00.269) 0:08:24.601 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 19 August 2025 14:24:36 -0400 (0:00:00.229) 0:08:24.871 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 19 August 2025 14:24:36 -0400 (0:00:00.229) 0:08:25.100 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 19 August 2025 14:24:37 -0400 (0:00:00.223) 0:08:25.324 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 19 August 2025 14:24:37 -0400 (0:00:00.121) 0:08:25.445 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Tuesday 19 August 2025 14:24:37 -0400 (0:00:00.163) 0:08:25.609 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path:
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 19 August 2025 14:24:37 -0400 (0:00:00.161) 0:08:25.770 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:24:37 -0400 (0:00:00.344) 0:08:26.115 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:24:38 -0400 (0:00:00.155) 0:08:26.270 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:24:38 -0400 (0:00:00.707) 0:08:26.977 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:24:39 -0400 (0:00:00.325) 0:08:27.303 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:24:39 -0400 (0:00:00.258) 0:08:27.562 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:24:39 -0400 (0:00:00.418) 0:08:27.980 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:24:40 -0400 (0:00:00.262) 0:08:28.243 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:24:40 -0400 (0:00:00.262) 0:08:28.506 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:24:40 -0400 (0:00:00.325) 0:08:28.831 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:24:40 -0400 (0:00:00.256) 0:08:29.088 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:24:41 -0400 (0:00:00.313) 0:08:29.401 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:24:41 -0400 (0:00:00.281) 0:08:29.683 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:24:41 -0400 (0:00:00.198) 0:08:29.882 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:24:41 -0400 (0:00:00.200) 0:08:30.082 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", 
"storage_test_fstab_id_matches": [ "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:24:42 -0400 (0:00:00.498) 0:08:30.581 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:24:42 -0400 (0:00:00.298) 0:08:30.880 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:24:42 -0400 (0:00:00.238) 0:08:31.119 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:24:43 -0400 (0:00:00.692) 0:08:31.811 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:24:43 -0400 (0:00:00.290) 0:08:32.102 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:24:44 -0400 (0:00:00.247) 0:08:32.350 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:24:44 -0400 (0:00:00.289) 0:08:32.639 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:24:44 -0400 (0:00:00.383) 0:08:33.023 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627838.235786, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755627838.235786, "dev": 6, "device_type": 2049, "executable": false, 
"exists": true, "gid": 6, "gr_name": "disk", "inode": 189882, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1755627838.235786, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:24:46 -0400 (0:00:01.445) 0:08:34.468 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:24:46 -0400 (0:00:00.284) 0:08:34.753 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:24:46 -0400 (0:00:00.202) 0:08:34.955 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:24:47 -0400 (0:00:00.280) 0:08:35.236 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:24:47 -0400 (0:00:00.170) 0:08:35.406 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:24:47 -0400 (0:00:00.191) 0:08:35.598 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:24:47 -0400 (0:00:00.184) 0:08:35.782 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627838.3797867, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755627838.3797867, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 189949, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755627838.3797867, "nlink": 
1, "path": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:24:49 -0400 (0:00:01.859) 0:08:37.641 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:24:54 -0400 (0:00:04.728) 0:08:42.370 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.009710", "end": "2025-08-19 14:24:55.651609", "rc": 0, "start": "2025-08-19 14:24:55.641899" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 7d1089ca-0f6b-48fe-9a70-e7c81f42081f Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 948582 Threads: 2 Salt: 81 72 57 de 66 0e ab c9 af 82 5b 94 7d d3 12 e9 af 2e ec 4a f4 22 b1 cc fa ef b5 c2 74 0a dd 24 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 119591 Salt: e6 94 f5 b8 80 dd 5b 15 64 ec 55 e9 e4 34 28 6e 22 63 04 dc 2a 41 12 8f 6b fa 5e 2d fa 27 00 d2 Digest: a2 38 06 67 9a 29 dd 79 d0 91 14 79 86 a7 e7 43 44 dd 90 84 03 11 6e 64 ef a7 63 4e 8b 87 78 d7 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:24:55 -0400 (0:00:01.687) 0:08:44.057 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:24:56 -0400 (0:00:00.359) 0:08:44.417 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:24:56 -0400 (0:00:00.238) 0:08:44.655 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:24:56 -0400 (0:00:00.128) 0:08:44.784 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:24:56 -0400 (0:00:00.146) 0:08:44.930 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:24:56 -0400 (0:00:00.198) 0:08:45.129 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:24:57 -0400 (0:00:00.241) 0:08:45.371 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:24:57 -0400 (0:00:00.163) 0:08:45.534 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:24:57 -0400 (0:00:00.153) 0:08:45.687 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:24:57 -0400 (0:00:00.179) 0:08:45.866 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:24:57 -0400 (0:00:00.247) 0:08:46.114 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:24:58 -0400 (0:00:00.255) 0:08:46.370 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:24:58 -0400 (0:00:00.170) 0:08:46.540 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, 
"changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:24:58 -0400 (0:00:00.086) 0:08:46.627 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:24:58 -0400 (0:00:00.156) 0:08:46.784 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:24:58 -0400 (0:00:00.166) 0:08:46.950 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:24:59 -0400 (0:00:00.283) 0:08:47.234 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:24:59 -0400 (0:00:00.305) 0:08:47.539 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:24:59 -0400 (0:00:00.263) 0:08:47.803 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:24:59 -0400 (0:00:00.315) 0:08:48.119 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:25:00 -0400 (0:00:00.246) 0:08:48.366 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:25:00 -0400 (0:00:00.280) 0:08:48.646 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:25:00 -0400 (0:00:00.192) 0:08:48.839 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:25:00 -0400 (0:00:00.197) 0:08:49.037 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:25:01 -0400 (0:00:00.310) 0:08:49.348 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:25:01 -0400 (0:00:00.340) 0:08:49.689 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:25:01 -0400 (0:00:00.306) 0:08:49.995 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:25:02 -0400 (0:00:00.151) 0:08:50.146 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:25:02 -0400 (0:00:00.276) 0:08:50.422 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:25:02 -0400 (0:00:00.272) 0:08:50.695 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:25:02 -0400 (0:00:00.252) 0:08:50.947 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:25:03 -0400 (0:00:00.216) 0:08:51.164 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:25:03 -0400 (0:00:00.313) 0:08:51.478 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:25:03 -0400 (0:00:00.236) 0:08:51.714 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:25:03 -0400 (0:00:00.358) 0:08:52.072 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:25:04 -0400 (0:00:00.315) 0:08:52.388 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:25:04 -0400 (0:00:00.183) 0:08:52.572 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:25:04 -0400 (0:00:00.324) 0:08:52.896 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:25:04 -0400 (0:00:00.172) 0:08:53.068 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:25:05 -0400 (0:00:00.264) 0:08:53.333 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:25:05 -0400 (0:00:00.211) 0:08:53.544 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:25:05 -0400 (0:00:00.103) 0:08:53.647 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected 
thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:25:05 -0400 (0:00:00.117) 0:08:53.765 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:25:06 -0400 (0:00:00.424) 0:08:54.190 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:25:06 -0400 (0:00:00.223) 0:08:54.413 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:25:06 -0400 (0:00:00.204) 0:08:54.617 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:25:06 -0400 (0:00:00.234) 0:08:54.852 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:25:06 -0400 (0:00:00.238) 0:08:55.090 ******** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:25:07 -0400 (0:00:00.252) 0:08:55.343 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:25:07 -0400 (0:00:00.166) 0:08:55.510 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:25:07 -0400 (0:00:00.224) 0:08:55.734 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:25:07 -0400 (0:00:00.214) 0:08:55.948 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:25:08 -0400 (0:00:00.213) 0:08:56.162 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:25:08 -0400 (0:00:00.260) 0:08:56.422 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:25:08 -0400 (0:00:00.210) 0:08:56.632 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:25:08 -0400 (0:00:00.308) 0:08:56.941 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:25:09 -0400 (0:00:00.281) 0:08:57.222 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:25:09 -0400 (0:00:00.353) 0:08:57.576 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:25:09 -0400 (0:00:00.258) 0:08:57.834 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:25:09 -0400 (0:00:00.225) 0:08:58.060 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 19 August 2025 14:25:10 -0400 (0:00:00.213) 0:08:58.274 ******** changed: [managed-node11] => { "changed": 
true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:239 Tuesday 19 August 2025 14:25:11 -0400 (0:00:01.558) 0:08:59.832 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:25:12 -0400 (0:00:00.612) 0:09:00.445 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:25:12 -0400 (0:00:00.301) 0:09:00.747 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:25:12 -0400 (0:00:00.370) 0:09:01.118 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:25:13 -0400 (0:00:00.347) 0:09:01.465 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:25:13 -0400 (0:00:00.243) 0:09:01.709 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' 
else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:25:14 -0400 (0:00:00.579) 0:09:02.288 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:25:14 -0400 (0:00:00.244) 0:09:02.533 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:25:14 -0400 (0:00:00.208) 0:09:02.741 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:25:14 -0400 (0:00:00.232) 0:09:02.974 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:25:15 -0400 (0:00:00.219) 0:09:03.193 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:25:15 -0400 (0:00:00.385) 0:09:03.579 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:25:20 -0400 (0:00:04.601) 0:09:08.181 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:25:20 -0400 (0:00:00.395) 0:09:08.576 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:25:20 -0400 (0:00:00.357) 0:09:08.934 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:25:26 -0400 (0:00:05.288) 0:09:14.222 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:25:26 -0400 (0:00:00.376) 0:09:14.599 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:25:26 -0400 (0:00:00.239) 0:09:14.839 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:25:26 -0400 (0:00:00.267) 0:09:15.106 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:25:27 -0400 (0:00:00.140) 0:09:15.246 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:25:31 -0400 (0:00:04.802) 0:09:20.049 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": 
"chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service": { "name": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service": { "name": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:25:34 -0400 (0:00:02.537) 0:09:22.587 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:25:34 -0400 (0:00:00.376) 0:09:22.964 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2df43d64bf\x2d4d0c\x2d4327\x2d9e73\x2d764d206c9f95.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "name": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f43d64bf-4d0c-4327-9e73-764d206c9f95", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f43d64bf-4d0c-4327-9e73-764d206c9f95 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; 
code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f43d64bf-4d0c-4327-9e73-764d206c9f95 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:24:10 EDT", "StateChangeTimestampMonotonic": "2167662663", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d4d0c\x2d4327\x2d9e73\x2d764d206c9f95.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "name": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", 
"StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:25:38 -0400 (0:00:03.393) 0:09:26.357 ******** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:25:43 -0400 (0:00:05.456) 0:09:31.814 ******** fatal: [managed-node11]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': 
None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:25:43 -0400 (0:00:00.252) 0:09:32.066 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2df43d64bf\x2d4d0c\x2d4327\x2d9e73\x2d764d206c9f95.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "name": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df43d64bf\\x2d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d4d0c\x2d4327\x2d9e73\x2d764d206c9f95.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "name": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", 
"ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d4d0c\\x2d4327\\x2d9e73\\x2d764d206c9f95.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:25:47 -0400 (0:00:03.223) 0:09:35.290 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:25:47 -0400 (0:00:00.203) 0:09:35.494 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 19 August 2025 14:25:47 -0400 (0:00:00.124) 0:09:35.618 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 19 August 2025 14:25:47 -0400 (0:00:00.107) 0:09:35.726 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627911.3852148, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755627911.3852148, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1755627911.3852148, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18043664", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 19 August 2025 14:25:49 -0400 (0:00:01.489) 0:09:37.215 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:263 Tuesday 19 August 2025 14:25:49 -0400 (0:00:00.321) 0:09:37.536 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:25:50 -0400 (0:00:01.362) 0:09:38.899 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:25:51 -0400 (0:00:00.401) 0:09:39.300 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:25:51 -0400 (0:00:00.282) 0:09:39.583 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:25:52 -0400 (0:00:00.569) 0:09:40.152 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:25:52 -0400 (0:00:00.277) 0:09:40.429 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:25:52 -0400 (0:00:00.127) 0:09:40.557 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:25:52 -0400 (0:00:00.168) 0:09:40.725 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:25:52 -0400 (0:00:00.171) 0:09:40.896 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:25:53 -0400 (0:00:00.477) 0:09:41.373 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:25:57 -0400 (0:00:04.275) 0:09:45.649 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:25:57 -0400 (0:00:00.243) 0:09:45.892 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:25:57 -0400 (0:00:00.245) 0:09:46.137 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:26:03 -0400 (0:00:05.355) 0:09:51.493 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:26:03 -0400 (0:00:00.443) 0:09:51.936 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:26:04 -0400 (0:00:00.277) 0:09:52.214 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:26:04 -0400 (0:00:00.262) 0:09:52.476 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:26:04 -0400 (0:00:00.217) 0:09:52.694 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:26:08 -0400 (0:00:04.434) 0:09:57.128 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": 
"cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": 
"getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", 
"source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { 
"name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service": { "name": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service": { "name": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:26:11 -0400 (0:00:02.593) 0:09:59.721 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:26:11 -0400 (0:00:00.257) 0:09:59.979 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d7d1089ca\x2d0f6b\x2d48fe\x2d9a70\x2de7c81f42081f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "name": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "status": { "ActiveEnterTimestampMonotonic": "0", 
"ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket dev-sda1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": 
"infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:25:37 EDT", "StateChangeTimestampMonotonic": "2254836319", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": 
"infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d0f6b\x2d48fe\x2d9a70\x2de7c81f42081f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "name": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", 
"LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 
Tuesday 19 August 2025 14:26:15 -0400 (0:00:03.311) 0:10:03.290 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:26:20 -0400 (0:00:05.847) 0:10:09.138 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:26:21 -0400 (0:00:00.230) 0:10:09.369 ******** ok: [managed-node11] => { "changed": false, 
"stat": { "atime": 1755627848.631847, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0a23b66cfcd2970d29d50e2a10c11d4f0cdef573", "ctime": 1755627848.628847, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755627848.628847, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:26:22 -0400 (0:00:01.256) 0:10:10.625 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:26:23 -0400 (0:00:01.501) 0:10:12.126 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d7d1089ca\x2d0f6b\x2d48fe\x2d9a70\x2de7c81f42081f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "name": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "DevicePolicy": "auto", "DynamicUser": "no", 
"EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": 
"0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:25:37 EDT", "StateChangeTimestampMonotonic": "2254836319", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d0f6b\x2d48fe\x2d9a70\x2de7c81f42081f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "name": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": 
"0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", 
"StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:26:27 -0400 (0:00:03.517) 0:10:15.644 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:26:27 -0400 (0:00:00.309) 0:10:15.954 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:26:28 -0400 (0:00:00.282) 0:10:16.236 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:26:28 -0400 (0:00:00.232) 0:10:16.469 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:26:30 -0400 (0:00:01.973) 0:10:18.442 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:26:32 -0400 (0:00:01.940) 0:10:20.383 ******** changed: [managed-node11] => (item={'src': 'UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:26:33 -0400 (0:00:01.583) 0:10:21.967 ******** skipping: [managed-node11] => (item={'src': 'UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:26:34 -0400 (0:00:00.383) 0:10:22.350 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:26:35 -0400 (0:00:01.668) 0:10:24.019 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627863.2299325, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d2ad8c370e68e8debb7a8929275a3cdf8ee22944", "ctime": 1755627855.0588846, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 94372038, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1755627855.0578847, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "1058182415", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:26:37 -0400 (0:00:01.582) 0:10:25.601 ******** changed: [managed-node11] => (item={'backing_device': '/dev/sda1', 'name': 'luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:26:39 -0400 (0:00:01.685) 0:10:27.287 ******** ok: [managed-node11] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Tuesday 19 August 2025 14:26:40 -0400 (0:00:01.793) 0:10:29.080 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:26:41 -0400 (0:00:00.708) 0:10:29.789 ******** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:26:41 -0400 (0:00:00.256) 0:10:30.046 ******** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:26:42 -0400 (0:00:00.520) 0:10:30.566 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "60d88914-fbf4-4327-a0d6-a90fc64917ca" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:26:43 -0400 (0:00:01.483) 0:10:32.049 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002128", "end": "2025-08-19 14:26:44.966287", "rc": 0, "start": "2025-08-19 14:26:44.964159" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
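Note: the /opt/test1 line that follows in this fstab listing is the role-managed entry, now keyed by filesystem UUID instead of the removed /dev/mapper/luks-* device. A minimal standalone equivalent of that mount state, assuming the values shown in the "Set up new/current mounts" result above (an illustrative sketch, not the role's internal implementation):

    - name: Mount the reformatted volume by UUID (illustrative sketch)
      ansible.posix.mount:
        path: /opt/test1
        src: UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca
        fstype: xfs
        opts: defaults
        state: mounted
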
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:26:45 -0400 (0:00:01.342) 0:10:33.392 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002227", "end": "2025-08-19 14:26:46.437994", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:26:46.435767" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:26:46 -0400 (0:00:01.503) 0:10:34.895 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 19 August 2025 14:26:47 -0400 (0:00:00.333) 0:10:35.229 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 19 August 2025 14:26:47 -0400 (0:00:00.189) 0:10:35.418 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 19 August 2025 14:26:47 -0400 (0:00:00.431) 0:10:35.850 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 19 August 2025 14:26:47 -0400 (0:00:00.253) 0:10:36.103 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 19 August 2025 14:26:48 -0400 (0:00:00.440) 0:10:36.543 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 19 August 2025 14:26:48 -0400 (0:00:00.221) 0:10:36.764 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 19 August 2025 14:26:48 -0400 (0:00:00.262) 0:10:37.027 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 19 August 2025 14:26:49 -0400 (0:00:00.208) 0:10:37.235 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 19 August 2025 14:26:49 -0400 (0:00:00.277) 0:10:37.513 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 19 August 2025 14:26:49 -0400 (0:00:00.213) 0:10:37.727 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 19 August 2025 14:26:49 -0400 (0:00:00.277) 0:10:38.004 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 19 August 2025 14:26:50 -0400 (0:00:00.221) 0:10:38.226 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Tuesday 19 August 2025 14:26:50 -0400 (0:00:00.239) 0:10:38.465 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Tuesday 19 August 2025 14:26:50 
-0400 (0:00:00.202) 0:10:38.668 ******** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.43.133 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Tuesday 19 August 2025 14:26:52 -0400 (0:00:01.492) 0:10:40.161 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Tuesday 19 August 2025 14:26:52 -0400 (0:00:00.169) 0:10:40.330 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 19 August 2025 14:26:52 -0400 (0:00:00.497) 0:10:40.828 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 19 August 2025 14:26:52 -0400 (0:00:00.169) 0:10:40.998 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 19 August 2025 14:26:53 -0400 (0:00:00.200) 0:10:41.198 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 19 August 2025 14:26:53 -0400 (0:00:00.245) 0:10:41.444 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 19 August 2025 14:26:53 -0400 (0:00:00.198) 0:10:41.642 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 19 August 2025 14:26:53 -0400 (0:00:00.243) 0:10:41.886 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 19 August 2025 14:26:53 -0400 (0:00:00.195) 0:10:42.082 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 19 August 2025 14:26:54 -0400 (0:00:00.146) 0:10:42.228 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 19 August 2025 14:26:54 -0400 (0:00:00.127) 0:10:42.356 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 19 August 2025 14:26:54 -0400 (0:00:00.131) 0:10:42.488 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 19 August 2025 14:26:54 -0400 (0:00:00.183) 0:10:42.671 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Tuesday 19 August 2025 14:26:54 -0400 (0:00:00.191) 0:10:42.862 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 19 August 2025 14:26:55 -0400 (0:00:00.449) 0:10:43.312 ******** skipping: [managed-node11] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": 
"/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Tuesday 19 August 2025 14:26:55 -0400 (0:00:00.203) 0:10:43.515 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 19 August 2025 14:26:55 -0400 (0:00:00.397) 0:10:43.913 ******** skipping: [managed-node11] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Tuesday 19 August 2025 14:26:56 -0400 (0:00:00.623) 0:10:44.537 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 19 August 2025 14:26:56 -0400 (0:00:00.339) 0:10:44.877 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 19 August 2025 14:26:57 -0400 (0:00:00.297) 0:10:45.174 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 19 August 2025 14:26:57 -0400 (0:00:00.151) 0:10:45.326 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 19 August 2025 14:26:57 -0400 (0:00:00.172) 0:10:45.498 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Tuesday 19 August 2025 14:26:57 -0400 (0:00:00.177) 0:10:45.676 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 19 August 2025 14:26:57 -0400 (0:00:00.420) 0:10:46.096 ******** skipping: [managed-node11] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Tuesday 19 August 2025 14:26:58 -0400 (0:00:00.189) 0:10:46.286 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 19 August 2025 14:26:58 -0400 (0:00:00.414) 0:10:46.701 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 19 August 2025 14:26:58 -0400 (0:00:00.210) 0:10:46.911 ******** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 19 August 2025 
14:26:58 -0400 (0:00:00.211) 0:10:47.123 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 19 August 2025 14:26:59 -0400 (0:00:00.182) 0:10:47.305 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 19 August 2025 14:26:59 -0400 (0:00:00.198) 0:10:47.503 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 19 August 2025 14:26:59 -0400 (0:00:00.109) 0:10:47.613 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 19 August 2025 14:26:59 -0400 (0:00:00.169) 0:10:47.782 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Tuesday 19 August 2025 14:26:59 -0400 (0:00:00.259) 0:10:48.042 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 19 August 2025 14:27:00 -0400 (0:00:00.229) 0:10:48.271 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:27:00 -0400 (0:00:00.513) 0:10:48.785 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:27:00 -0400 (0:00:00.313) 0:10:49.098 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:27:02 -0400 (0:00:01.048) 0:10:50.147 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:27:02 -0400 (0:00:00.275) 0:10:50.422 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:27:02 -0400 (0:00:00.304) 0:10:50.726 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:27:02 -0400 (0:00:00.217) 0:10:50.944 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:27:03 -0400 (0:00:00.308) 0:10:51.253 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:27:03 -0400 (0:00:00.316) 0:10:51.569 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:27:03 -0400 (0:00:00.240) 0:10:51.810 ******** 
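Note: the mount verification above works from the storage_test_device_path and storage_test_mount_expected_mount_point facts just set. A hedged sketch of an equivalent assertion against gathered mount facts (the exact assert used by test-verify-volume-mount.yml is not shown in this excerpt):

    - name: Check that the expected device is mounted at the expected path (illustrative sketch)
      assert:
        that:
          - >-
            ansible_facts.mounts | selectattr('device', 'equalto', storage_test_device_path)
            | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point)
            | list | length == 1
        fail_msg: "{{ storage_test_device_path }} is not mounted on {{ storage_test_mount_expected_mount_point }}"
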
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:27:03 -0400 (0:00:00.270) 0:10:52.080 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:27:04 -0400 (0:00:00.323) 0:10:52.404 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:27:04 -0400 (0:00:00.286) 0:10:52.691 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:27:04 -0400 (0:00:00.233) 0:10:52.924 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:27:04 -0400 (0:00:00.216) 0:10:53.141 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:27:05 -0400 (0:00:00.506) 0:10:53.648 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:27:05 -0400 (0:00:00.243) 0:10:53.892 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:27:06 -0400 (0:00:00.268) 0:10:54.161 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:27:06 -0400 (0:00:00.775) 0:10:54.936 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:27:07 -0400 (0:00:00.279) 0:10:55.216 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:27:07 -0400 (0:00:00.189) 0:10:55.406 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:27:07 -0400 (0:00:00.294) 0:10:55.701 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:27:07 -0400 (0:00:00.189) 0:10:55.890 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627980.6306164, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755627980.6306164, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 189882, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1755627980.6306164, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:27:09 -0400 (0:00:01.312) 0:10:57.203 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:27:09 -0400 (0:00:00.306) 0:10:57.509 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:27:09 -0400 (0:00:00.197) 0:10:57.706 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:27:09 -0400 (0:00:00.286) 0:10:57.993 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:27:10 -0400 (0:00:00.267) 0:10:58.261 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:27:10 -0400 (0:00:00.285) 0:10:58.547 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:27:10 -0400 (0:00:00.375) 0:10:58.923 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:27:11 -0400 (0:00:00.306) 0:10:59.230 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:27:15 -0400 (0:00:04.789) 0:11:04.019 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:27:16 -0400 (0:00:00.183) 0:11:04.203 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:27:16 -0400 (0:00:00.247) 0:11:04.450 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:27:16 -0400 (0:00:00.297) 0:11:04.748 
******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:27:16 -0400 (0:00:00.251) 0:11:04.999 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:27:16 -0400 (0:00:00.127) 0:11:05.127 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:27:17 -0400 (0:00:00.121) 0:11:05.248 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:27:17 -0400 (0:00:00.107) 0:11:05.356 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:27:17 -0400 (0:00:00.173) 0:11:05.529 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:27:17 -0400 (0:00:00.165) 0:11:05.694 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:27:17 -0400 (0:00:00.303) 0:11:05.998 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:27:18 -0400 (0:00:00.157) 0:11:06.156 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:27:18 -0400 (0:00:00.202) 0:11:06.359 ******** skipping: [managed-node11] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:27:18 -0400 (0:00:00.192) 0:11:06.551 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:27:18 -0400 (0:00:00.232) 0:11:06.784 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:27:18 -0400 (0:00:00.251) 0:11:07.035 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:27:19 -0400 (0:00:00.248) 0:11:07.284 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:27:19 -0400 (0:00:00.265) 0:11:07.549 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:27:19 -0400 (0:00:00.283) 0:11:07.833 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:27:19 -0400 (0:00:00.238) 0:11:08.072 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:27:20 -0400 (0:00:00.188) 0:11:08.261 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:27:20 -0400 (0:00:00.175) 0:11:08.436 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID 
metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:27:20 -0400 (0:00:00.157) 0:11:08.594 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:27:20 -0400 (0:00:00.235) 0:11:08.829 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:27:20 -0400 (0:00:00.194) 0:11:09.024 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:27:21 -0400 (0:00:00.215) 0:11:09.239 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:27:21 -0400 (0:00:00.269) 0:11:09.509 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:27:21 -0400 (0:00:00.273) 0:11:09.782 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:27:21 -0400 (0:00:00.179) 0:11:09.962 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:27:22 -0400 (0:00:00.309) 0:11:10.271 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:27:22 -0400 (0:00:00.235) 0:11:10.507 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:27:22 -0400 (0:00:00.191) 0:11:10.699 ******** skipping: 
[managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:27:22 -0400 (0:00:00.224) 0:11:10.924 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:27:23 -0400 (0:00:00.267) 0:11:11.192 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:27:23 -0400 (0:00:00.233) 0:11:11.425 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:27:23 -0400 (0:00:00.320) 0:11:11.746 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:27:23 -0400 (0:00:00.235) 0:11:11.982 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:27:24 -0400 (0:00:00.303) 0:11:12.286 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:27:24 -0400 (0:00:00.280) 0:11:12.566 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:27:24 -0400 (0:00:00.157) 0:11:12.723 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:27:24 -0400 (0:00:00.275) 0:11:12.999 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 
August 2025 14:27:25 -0400 (0:00:00.226) 0:11:13.225 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:27:25 -0400 (0:00:00.247) 0:11:13.473 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:27:25 -0400 (0:00:00.235) 0:11:13.709 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:27:25 -0400 (0:00:00.273) 0:11:13.982 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:27:26 -0400 (0:00:00.276) 0:11:14.259 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:27:26 -0400 (0:00:00.286) 0:11:14.545 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:27:26 -0400 (0:00:00.211) 0:11:14.757 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:27:26 -0400 (0:00:00.098) 0:11:14.856 ******** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:27:26 -0400 (0:00:00.229) 0:11:15.085 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:27:27 -0400 (0:00:00.203) 0:11:15.288 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] 
******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:27:27 -0400 (0:00:00.200) 0:11:15.488 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:27:27 -0400 (0:00:00.190) 0:11:15.679 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:27:27 -0400 (0:00:00.191) 0:11:15.870 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:27:27 -0400 (0:00:00.271) 0:11:16.142 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:27:28 -0400 (0:00:00.614) 0:11:16.756 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:27:28 -0400 (0:00:00.254) 0:11:17.011 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:27:29 -0400 (0:00:00.265) 0:11:17.277 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:27:29 -0400 (0:00:00.262) 0:11:17.539 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:27:29 -0400 (0:00:00.191) 0:11:17.730 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:27:29 -0400 (0:00:00.168) 0:11:17.899 ******** ok: [managed-node11] => { 
"ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 19 August 2025 14:27:29 -0400 (0:00:00.199) 0:11:18.098 ******** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:286 Tuesday 19 August 2025 14:27:31 -0400 (0:00:01.552) 0:11:19.651 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:27:32 -0400 (0:00:00.752) 0:11:20.403 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:27:32 -0400 (0:00:00.295) 0:11:20.698 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:27:33 -0400 (0:00:00.534) 0:11:21.233 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:27:33 -0400 (0:00:00.407) 0:11:21.640 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:27:33 -0400 (0:00:00.374) 0:11:22.015 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:27:34 -0400 (0:00:00.733) 0:11:22.749 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:27:34 -0400 (0:00:00.185) 0:11:22.934 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:27:34 -0400 (0:00:00.202) 0:11:23.136 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:27:35 -0400 (0:00:00.201) 0:11:23.338 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:27:35 -0400 (0:00:00.196) 0:11:23.535 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:27:35 -0400 (0:00:00.411) 0:11:23.946 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:27:39 -0400 (0:00:03.739) 0:11:27.686 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK 
[fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:27:39 -0400 (0:00:00.337) 0:11:28.024 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:27:40 -0400 (0:00:00.209) 0:11:28.233 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:27:45 -0400 (0:00:05.126) 0:11:33.359 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:27:45 -0400 (0:00:00.301) 0:11:33.661 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:27:45 -0400 (0:00:00.239) 0:11:33.901 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:27:45 -0400 (0:00:00.205) 0:11:34.106 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:27:46 -0400 (0:00:00.240) 0:11:34.346 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:27:51 -0400 (0:00:04.881) 0:11:39.227 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": 
"unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": 
"sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service": { "name": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service": { "name": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", 
"status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:27:54 -0400 (0:00:03.336) 0:11:42.564 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:27:54 -0400 (0:00:00.556) 0:11:43.121 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d7d1089ca\x2d0f6b\x2d48fe\x2d9a70\x2de7c81f42081f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "name": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-sda1.device systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", 
"ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:25:37 EDT", "StateChangeTimestampMonotonic": "2254836319", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d0f6b\x2d48fe\x2d9a70\x2de7c81f42081f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "name": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": 
"success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:27:58 -0400 (0:00:03.711) 0:11:46.833 ******** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:28:04 -0400 (0:00:05.559) 0:11:52.392 ******** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:28:04 -0400 (0:00:00.354) 0:11:52.746 ******** changed: [managed-node11] => 
(item=systemd-cryptsetup@luks\x2d7d1089ca\x2d0f6b\x2d48fe\x2d9a70\x2de7c81f42081f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "name": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d7d1089ca\\x2d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d0f6b\x2d48fe\x2d9a70\x2de7c81f42081f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "name": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d0f6b\\x2d48fe\\x2d9a70\\x2de7c81f42081f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:28:08 -0400 (0:00:03.563) 0:11:56.310 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:28:08 -0400 (0:00:00.277) 0:11:56.588 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 
Tuesday 19 August 2025 14:28:08 -0400 (0:00:00.289) 0:11:56.877 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 19 August 2025 14:28:09 -0400 (0:00:00.273) 0:11:57.151 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628051.2540238, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755628051.2540238, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1755628051.2540238, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "785609312", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 19 August 2025 14:28:10 -0400 (0:00:01.528) 0:11:58.679 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:312 Tuesday 19 August 2025 14:28:10 -0400 (0:00:00.250) 0:11:58.929 ******** ok: [managed-node11] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testkbs2wr22lukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:319 Tuesday 19 August 2025 14:28:13 -0400 (0:00:02.576) 0:12:01.506 ******** ok: [managed-node11] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testkbs2wr22lukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1755628093.6819057-176838-109722165468910/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:326 Tuesday 19 August 2025 14:28:17 -0400 (0:00:04.255) 0:12:05.762 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:28:18 -0400 (0:00:00.446) 0:12:06.208 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK 
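After confirming that the passphrase-based attempt fails safely, the test switches to key-file-based LUKS: it creates a temporary key file (/tmp/storage_testkbs2wr22lukskey) with mode 0600, writes 32 bytes of key material into it, and re-runs the role with that path as the volume's encryption_key (see the Show storage_pools output further below). A minimal sketch of that pattern follows; the module choices, register name, and key generation are illustrative, not the test's exact tasks:

- name: Create a key file
  tempfile:
    state: file
    suffix: lukskey
  register: storage_test_key_file     # illustrative register name

- name: Write the key into the key file
  copy:
    content: "{{ lookup('password', '/dev/null length=32 chars=ascii_letters') }}"  # illustrative key material
    dest: "{{ storage_test_key_file.path }}"
    mode: "0600"

- name: Add encryption to the volume using the key file
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_pools:
      - name: foo
        type: partition
        disks: [sda]
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_key: "{{ storage_test_key_file.path }}"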
[fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:28:18 -0400 (0:00:00.352) 0:12:06.561 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:28:18 -0400 (0:00:00.321) 0:12:06.882 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:28:19 -0400 (0:00:00.682) 0:12:07.565 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:28:19 -0400 (0:00:00.358) 0:12:07.924 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:28:20 -0400 (0:00:00.278) 0:12:08.202 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:28:20 -0400 (0:00:00.229) 0:12:08.432 ******** ok: [managed-node11] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:28:20 -0400 (0:00:00.199) 0:12:08.632 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:28:21 -0400 (0:00:00.707) 0:12:09.339 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:28:26 -0400 (0:00:04.941) 0:12:14.280 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testkbs2wr22lukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:28:26 -0400 (0:00:00.183) 0:12:14.463 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:28:26 -0400 (0:00:00.099) 0:12:14.563 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:28:31 -0400 (0:00:05.258) 0:12:19.822 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:28:32 -0400 (0:00:00.811) 0:12:20.634 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:28:32 -0400 (0:00:00.239) 0:12:20.873 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:28:33 -0400 (0:00:00.307) 0:12:21.181 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are 
installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:28:33 -0400 (0:00:00.198) 0:12:21.379 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:28:37 -0400 (0:00:04.616) 0:12:25.996 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", 
"status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:28:40 -0400 (0:00:02.747) 0:12:28.744 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:28:40 -0400 (0:00:00.179) 0:12:28.923 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:28:40 -0400 (0:00:00.195) 0:12:29.118 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "password": "/tmp/storage_testkbs2wr22lukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, 
"encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkbs2wr22lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:28:54 -0400 (0:00:13.702) 0:12:42.821 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:28:54 -0400 (0:00:00.176) 0:12:42.998 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755627993.5806916, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3ebb6741e76ba1ac61d9e5a2a0fd865d408193fa", "ctime": 1755627993.5786915, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755627993.5786915, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:28:56 -0400 (0:00:01.500) 0:12:44.498 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 
19 August 2025 14:28:58 -0400 (0:00:01.653) 0:12:46.152 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:28:58 -0400 (0:00:00.216) 0:12:46.368 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "password": "/tmp/storage_testkbs2wr22lukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkbs2wr22lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:28:58 -0400 
(0:00:00.213) 0:12:46.581 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkbs2wr22lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:28:58 -0400 (0:00:00.269) 0:12:46.851 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:28:58 -0400 (0:00:00.250) 0:12:47.101 ******** changed: [managed-node11] => (item={'src': 'UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=60d88914-fbf4-4327-a0d6-a90fc64917ca" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:29:00 -0400 (0:00:01.746) 0:12:48.848 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 
Tuesday 19 August 2025 14:29:02 -0400 (0:00:01.821) 0:12:50.669 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:29:04 -0400 (0:00:01.600) 0:12:52.269 ******** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:29:04 -0400 (0:00:00.294) 0:12:52.564 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:29:06 -0400 (0:00:01.941) 0:12:54.505 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628006.4367661, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755627998.9327226, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 249561285, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1755627998.9307225, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3148966492", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:29:07 -0400 (0:00:01.492) 0:12:55.998 ******** changed: [managed-node11] => (item={'backing_device': '/dev/sda1', 'name': 
'luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', 'password': '/tmp/storage_testkbs2wr22lukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "password": "/tmp/storage_testkbs2wr22lukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:29:09 -0400 (0:00:01.398) 0:12:57.396 ******** ok: [managed-node11] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:343 Tuesday 19 August 2025 14:29:11 -0400 (0:00:01.901) 0:12:59.298 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:29:11 -0400 (0:00:00.487) 0:12:59.785 ******** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkbs2wr22lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:29:11 -0400 (0:00:00.281) 0:13:00.066 ******** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:29:12 -0400 (0:00:00.334) 0:13:00.401 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "size": "4G", "type": "crypt", "uuid": "12d9309f-f418-4f94-a764-b2484b9e2167" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "ac72a752-0f7b-4918-a9fe-b8223790efc2" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:29:13 -0400 (0:00:01.628) 0:13:02.029 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002220", "end": "2025-08-19 14:29:15.164021", "rc": 0, "start": "2025-08-19 14:29:15.161801" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:29:15 -0400 (0:00:01.545) 0:13:03.574 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002590", "end": "2025-08-19 14:29:16.504235", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:29:16.501645" } STDOUT: luks-ac72a752-0f7b-4918-a9fe-b8223790efc2 /dev/sda1 /tmp/storage_testkbs2wr22lukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:29:16 -0400 (0:00:01.313) 0:13:04.887 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 19 August 2025 14:29:17 -0400 (0:00:00.508) 0:13:05.396 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 19 August 2025 14:29:17 -0400 (0:00:00.210) 0:13:05.606 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 19 August 2025 14:29:17 -0400 (0:00:00.260) 0:13:05.867 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 19 August 2025 14:29:18 -0400 (0:00:00.315) 0:13:06.183 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for 
managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 19 August 2025 14:29:18 -0400 (0:00:00.573) 0:13:06.757 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 19 August 2025 14:29:18 -0400 (0:00:00.215) 0:13:06.973 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 19 August 2025 14:29:18 -0400 (0:00:00.166) 0:13:07.139 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 19 August 2025 14:29:19 -0400 (0:00:00.262) 0:13:07.402 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 19 August 2025 14:29:19 -0400 (0:00:00.188) 0:13:07.590 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 19 August 2025 14:29:19 -0400 (0:00:00.222) 0:13:07.812 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 19 August 2025 14:29:19 -0400 (0:00:00.255) 0:13:08.068 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 19 August 2025 14:29:20 -0400 (0:00:00.196) 0:13:08.264 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Tuesday 19 August 2025 14:29:20 -0400 (0:00:00.231) 0:13:08.496 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Tuesday 
19 August 2025 14:29:20 -0400 (0:00:00.229) 0:13:08.725 ******** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.43.133 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Tuesday 19 August 2025 14:29:22 -0400 (0:00:01.643) 0:13:10.369 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Tuesday 19 August 2025 14:29:22 -0400 (0:00:00.190) 0:13:10.560 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 19 August 2025 14:29:22 -0400 (0:00:00.323) 0:13:10.883 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 19 August 2025 14:29:22 -0400 (0:00:00.247) 0:13:11.130 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 19 August 2025 14:29:23 -0400 (0:00:00.253) 0:13:11.384 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 19 August 2025 14:29:23 -0400 (0:00:00.268) 0:13:11.652 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 19 August 2025 14:29:23 -0400 (0:00:00.219) 0:13:11.871 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 19 August 2025 14:29:23 -0400 (0:00:00.207) 0:13:12.078 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 19 August 2025 14:29:24 -0400 (0:00:00.258) 0:13:12.337 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** 
task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 19 August 2025 14:29:24 -0400 (0:00:00.171) 0:13:12.508 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 19 August 2025 14:29:24 -0400 (0:00:00.214) 0:13:12.723 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 19 August 2025 14:29:24 -0400 (0:00:00.205) 0:13:12.929 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 19 August 2025 14:29:24 -0400 (0:00:00.108) 0:13:13.037 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Tuesday 19 August 2025 14:29:25 -0400 (0:00:00.289) 0:13:13.327 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 19 August 2025 14:29:25 -0400 (0:00:00.397) 0:13:13.725 ******** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testkbs2wr22lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional 
result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkbs2wr22lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Tuesday 19 August 2025 14:29:25 -0400 (0:00:00.319) 0:13:14.044 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 19 August 2025 14:29:26 -0400 (0:00:00.475) 0:13:14.519 ******** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testkbs2wr22lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkbs2wr22lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Tuesday 19 August 2025 14:29:26 -0400 (0:00:00.397) 0:13:14.917 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 19 August 2025 14:29:27 -0400 (0:00:00.595) 0:13:15.512 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 19 August 2025 14:29:27 -0400 (0:00:00.277) 0:13:15.790 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 19 August 2025 14:29:28 -0400 (0:00:00.557) 0:13:16.348 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 19 August 2025 14:29:28 -0400 (0:00:00.186) 0:13:16.534 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Tuesday 19 August 2025 14:29:28 -0400 (0:00:00.311) 0:13:16.845 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 19 August 2025 14:29:29 -0400 (0:00:00.634) 0:13:17.480 ******** skipping: [managed-node11] 
=> (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testkbs2wr22lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkbs2wr22lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Tuesday 19 August 2025 14:29:29 -0400 (0:00:00.378) 0:13:17.858 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 19 August 2025 14:29:30 -0400 (0:00:00.684) 0:13:18.543 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 19 August 2025 
14:29:30 -0400 (0:00:00.293) 0:13:18.837 ******** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 19 August 2025 14:29:31 -0400 (0:00:00.329) 0:13:19.167 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 19 August 2025 14:29:31 -0400 (0:00:00.301) 0:13:19.469 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 19 August 2025 14:29:31 -0400 (0:00:00.228) 0:13:19.697 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 19 August 2025 14:29:31 -0400 (0:00:00.226) 0:13:19.924 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 19 August 2025 14:29:32 -0400 (0:00:00.225) 0:13:20.149 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Tuesday 19 August 2025 14:29:32 -0400 (0:00:00.233) 0:13:20.382 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 19 August 2025 14:29:32 -0400 (0:00:00.130) 0:13:20.513 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:29:32 -0400 (0:00:00.337) 0:13:20.850 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 
August 2025 14:29:32 -0400 (0:00:00.210) 0:13:21.061 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:29:34 -0400 (0:00:01.136) 0:13:22.197 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:29:34 -0400 (0:00:00.245) 0:13:22.443 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:29:34 -0400 (0:00:00.166) 0:13:22.609 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:29:34 -0400 (0:00:00.252) 0:13:22.862 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:29:35 -0400 (0:00:00.329) 0:13:23.192 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:29:35 -0400 (0:00:00.299) 0:13:23.491 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory 
permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:29:35 -0400 (0:00:00.225) 0:13:23.716 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:29:35 -0400 (0:00:00.289) 0:13:24.006 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:29:36 -0400 (0:00:00.273) 0:13:24.279 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:29:36 -0400 (0:00:00.282) 0:13:24.562 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:29:36 -0400 (0:00:00.244) 0:13:24.806 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:29:36 -0400 (0:00:00.178) 0:13:24.984 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:29:37 -0400 (0:00:00.541) 0:13:25.526 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:29:37 -0400 (0:00:00.318) 0:13:25.844 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:29:37 -0400 (0:00:00.256) 0:13:26.100 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:29:38 -0400 (0:00:00.185) 0:13:26.286 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:29:38 -0400 (0:00:00.249) 0:13:26.536 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:29:38 -0400 (0:00:00.255) 0:13:26.791 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:29:38 -0400 (0:00:00.294) 0:13:27.086 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:29:39 -0400 (0:00:00.215) 0:13:27.301 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628134.135492, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628134.135492, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 220918, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1755628134.135492, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:29:40 -0400 (0:00:01.567) 0:13:28.869 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:29:41 -0400 (0:00:00.282) 0:13:29.151 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:29:41 -0400 (0:00:00.264) 0:13:29.416 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:29:41 -0400 (0:00:00.213) 0:13:29.629 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:29:41 -0400 (0:00:00.302) 0:13:29.931 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:29:42 -0400 (0:00:00.218) 0:13:30.150 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:29:42 -0400 (0:00:00.231) 0:13:30.381 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628134.289493, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628134.289493, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 221012, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755628134.289493, "nlink": 1, "path": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:29:44 -0400 (0:00:01.792) 0:13:32.173 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:29:48 -0400 (0:00:04.837) 0:13:37.010 ******** ok: [managed-node11] => { "changed": 
false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010272", "end": "2025-08-19 14:29:50.076695", "rc": 0, "start": "2025-08-19 14:29:50.066423" } STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           ac72a752-0f7b-4918-a9fe-b8223790efc2
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     941171
        Threads:    2
        Salt:       f2 3d aa 35 89 db a5 cb a4 45 e7 40 21 ac 69 f5 15 ea 65 10 a2 94 33 6c 6b 76 dd 39 27 59 3c 6e
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120470
        Salt:       9b 81 e3 06 bd 29 ea 6d 72 b8 e8 5f e5 55 9a f2 6e 0c b8 8c 13 27 f5 8e b4 b9 8e 26 56 5e b1 27
        Digest:     f5 f9 75 b3 fe 7e a3 c1 74 2a a8 f2 81 8d f4 32 04 8b bb f7 98 a7 c8 c0 19 fe fb ea 5c 77 b6 7c
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:29:50 -0400 (0:00:01.464) 0:13:38.475 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:29:50 -0400 (0:00:00.347) 0:13:38.823 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:29:50 -0400 (0:00:00.273) 0:13:39.096 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:29:51 -0400 (0:00:00.194) 0:13:39.291 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:29:51 -0400 (0:00:00.307) 0:13:39.599 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:29:51 -0400 (0:00:00.381) 0:13:39.981 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:29:52 -0400
(0:00:00.292) 0:13:40.273 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:29:52 -0400 (0:00:00.274) 0:13:40.548 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-ac72a752-0f7b-4918-a9fe-b8223790efc2 /dev/sda1 /tmp/storage_testkbs2wr22lukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testkbs2wr22lukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:29:52 -0400 (0:00:00.348) 0:13:40.897 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:29:53 -0400 (0:00:00.370) 0:13:41.267 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:29:53 -0400 (0:00:00.355) 0:13:41.623 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:29:53 -0400 (0:00:00.247) 0:13:41.870 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:29:53 -0400 (0:00:00.264) 0:13:42.135 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:29:54 -0400 (0:00:00.191) 0:13:42.326 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:29:54 -0400 (0:00:00.261) 0:13:42.588 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:29:54 -0400 (0:00:00.238) 0:13:42.827 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:29:54 -0400 (0:00:00.194) 0:13:43.021 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:29:55 -0400 (0:00:00.252) 0:13:43.273 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:29:55 -0400 (0:00:00.287) 0:13:43.561 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:29:55 -0400 (0:00:00.148) 0:13:43.709 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:29:55 -0400 (0:00:00.277) 0:13:43.987 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:29:56 -0400 (0:00:00.217) 0:13:44.205 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:29:56 -0400 (0:00:00.235) 0:13:44.441 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:29:56 -0400 (0:00:00.204) 0:13:44.645 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:29:56 -0400 (0:00:00.162) 0:13:44.808 ******** skipping: 
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:29:56 -0400 (0:00:00.189) 0:13:44.997 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:29:57 -0400 (0:00:00.227) 0:13:45.224 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:29:57 -0400 (0:00:00.202) 0:13:45.427 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:29:57 -0400 (0:00:00.171) 0:13:45.598 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:29:57 -0400 (0:00:00.158) 0:13:45.757 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:29:57 -0400 (0:00:00.172) 0:13:45.929 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:29:57 -0400 (0:00:00.098) 0:13:46.028 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:29:58 -0400 (0:00:00.266) 0:13:46.294 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:29:58 -0400 (0:00:00.234) 0:13:46.529 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:29:58 -0400 (0:00:00.285) 
0:13:46.815 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:29:58 -0400 (0:00:00.276) 0:13:47.091 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:29:59 -0400 (0:00:00.340) 0:13:47.432 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:29:59 -0400 (0:00:00.232) 0:13:47.664 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:29:59 -0400 (0:00:00.170) 0:13:47.835 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:29:59 -0400 (0:00:00.259) 0:13:48.094 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:30:00 -0400 (0:00:00.231) 0:13:48.326 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:30:00 -0400 (0:00:00.213) 0:13:48.539 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:30:00 -0400 (0:00:00.246) 0:13:48.785 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:30:00 -0400 (0:00:00.174) 0:13:48.960 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:30:01 -0400 
(0:00:00.210) 0:13:49.170 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:30:01 -0400 (0:00:00.216) 0:13:49.387 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:30:01 -0400 (0:00:00.261) 0:13:49.648 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:30:01 -0400 (0:00:00.236) 0:13:49.884 ******** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:30:01 -0400 (0:00:00.207) 0:13:50.092 ******** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:30:02 -0400 (0:00:00.301) 0:13:50.393 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:30:02 -0400 (0:00:00.258) 0:13:50.651 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:30:02 -0400 (0:00:00.246) 0:13:50.898 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:30:02 -0400 (0:00:00.242) 0:13:51.140 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:30:03 -0400 (0:00:00.189) 0:13:51.330 ******** skipping: [managed-node11] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:30:03 -0400 (0:00:00.716) 0:13:52.046 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:30:04 -0400 (0:00:00.223) 0:13:52.270 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:30:04 -0400 (0:00:00.265) 0:13:52.536 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:30:04 -0400 (0:00:00.302) 0:13:52.838 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:30:04 -0400 (0:00:00.158) 0:13:52.997 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:30:05 -0400 (0:00:00.189) 0:13:53.187 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:349 Tuesday 19 August 2025 14:30:05 -0400 (0:00:00.185) 0:13:53.373 ******** ok: [managed-node11] => { "changed": false, "path": "/tmp/storage_testkbs2wr22lukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:359 Tuesday 19 August 2025 14:30:06 -0400 (0:00:01.602) 0:13:54.975 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:30:07 -0400 (0:00:00.283) 0:13:55.258 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:30:07 -0400 (0:00:00.243) 0:13:55.502 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:30:07 -0400 (0:00:00.379) 0:13:55.881 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:30:08 -0400 (0:00:00.461) 0:13:56.342 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:30:08 -0400 (0:00:00.383) 0:13:56.726 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:30:09 -0400 (0:00:00.539) 0:13:57.265 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:30:09 -0400 (0:00:00.351) 0:13:57.617 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:30:09 -0400 (0:00:00.362) 0:13:57.980 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:30:10 -0400 (0:00:00.272) 0:13:58.253 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:30:10 -0400 (0:00:00.347) 0:13:58.600 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:30:11 -0400 (0:00:00.661) 0:13:59.261 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:30:16 -0400 (0:00:04.910) 0:14:04.172 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:30:16 -0400 (0:00:00.355) 0:14:04.527 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:30:16 -0400 (0:00:00.293) 0:14:04.821 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:30:22 -0400 (0:00:05.532) 0:14:10.354 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:30:22 -0400 (0:00:00.351) 0:14:10.706 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 
Tuesday 19 August 2025 14:30:22 -0400 (0:00:00.277) 0:14:10.984 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:30:23 -0400 (0:00:00.318) 0:14:11.302 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:30:23 -0400 (0:00:00.178) 0:14:11.481 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:30:27 -0400 (0:00:04.538) 0:14:16.019 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": 
"crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": 
"getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:30:30 -0400 (0:00:02.673) 0:14:18.693 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:30:30 -0400 (0:00:00.394) 0:14:19.087 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:30:31 -0400 (0:00:00.185) 0:14:19.272 ******** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:30:36 -0400 (0:00:05.435) 0:14:24.708 ******** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:30:36 -0400 (0:00:00.179) 0:14:24.887 ******** TASK [Check that we failed in the role] **************************************** task path: 
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:30:36 -0400 (0:00:00.131) 0:14:25.019 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:30:37 -0400 (0:00:00.193) 0:14:25.213 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 19 August 2025 14:30:37 -0400 (0:00:00.349) 0:14:25.562 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:377 Tuesday 19 August 2025 14:30:37 -0400 (0:00:00.202) 0:14:25.765 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:30:38 -0400 (0:00:00.525) 0:14:26.291 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:30:38 -0400 (0:00:00.384) 0:14:26.675 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:30:38 -0400 (0:00:00.301) 0:14:26.976 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, 
"ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:30:39 -0400 (0:00:00.602) 0:14:27.579 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:30:39 -0400 (0:00:00.335) 0:14:27.915 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:30:39 -0400 (0:00:00.177) 0:14:28.093 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:30:40 -0400 (0:00:00.329) 0:14:28.422 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:30:40 -0400 (0:00:00.357) 0:14:28.780 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:30:41 -0400 (0:00:00.643) 0:14:29.423 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:30:45 -0400 (0:00:04.417) 0:14:33.841 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:30:45 -0400 (0:00:00.199) 0:14:34.040 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:30:46 -0400 (0:00:00.292) 0:14:34.332 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:30:51 -0400 (0:00:05.413) 0:14:39.746 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:30:51 -0400 (0:00:00.331) 0:14:40.077 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:30:52 -0400 (0:00:00.243) 0:14:40.320 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:30:52 -0400 (0:00:00.199) 0:14:40.520 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:30:52 -0400 (0:00:00.132) 0:14:40.652 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:30:57 -0400 (0:00:04.957) 0:14:45.609 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:31:00 -0400 (0:00:02.931) 0:14:48.541 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 
14:31:00 -0400 (0:00:00.302) 0:14:48.843 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:31:00 -0400 (0:00:00.217) 0:14:49.060 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-394e1e52-cec6-40c8-bf74-b0e069960529", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, 
"mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:31:13 -0400 (0:00:12.156) 0:15:01.216 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:31:13 -0400 (0:00:00.153) 0:15:01.370 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628143.834547, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0423acca6e7e8ce59b4d08c0967053e6d6ffacd4", "ctime": 1755628143.831547, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755628143.831547, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:31:14 -0400 (0:00:01.611) 0:15:02.982 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:31:16 -0400 (0:00:01.656) 0:15:04.638 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:31:16 -0400 (0:00:00.220) 0:15:04.859 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": 
"create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-394e1e52-cec6-40c8-bf74-b0e069960529", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:31:17 -0400 (0:00:00.293) 0:15:05.153 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:31:17 -0400 (0:00:00.183) 0:15:05.336 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:31:17 -0400 (0:00:00.250) 0:15:05.587 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ac72a752-0f7b-4918-a9fe-b8223790efc2" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:31:18 -0400 (0:00:01.520) 0:15:07.107 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:31:20 -0400 (0:00:01.855) 0:15:08.963 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529', 'path': '/opt/test1', 'fstype': 'xfs', 
'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:31:22 -0400 (0:00:01.678) 0:15:10.641 ******** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:31:22 -0400 (0:00:00.238) 0:15:10.879 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:31:24 -0400 (0:00:01.709) 0:15:12.588 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628156.5026186, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "957a29ec4f049a7218931a31bfe7cd346fc3ff15", "ctime": 1755628148.9465759, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 390070468, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1755628148.945576, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "2158532873", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:31:25 -0400 (0:00:01.302) 0:15:13.890 ******** changed: [managed-node11] => (item={'backing_device': '/dev/sda1', 'name': 'luks-ac72a752-0f7b-4918-a9fe-b8223790efc2', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", 
"password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node11] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-394e1e52-cec6-40c8-bf74-b0e069960529', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-394e1e52-cec6-40c8-bf74-b0e069960529", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:31:28 -0400 (0:00:03.023) 0:15:16.914 ******** ok: [managed-node11] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:396 Tuesday 19 August 2025 14:31:30 -0400 (0:00:02.178) 0:15:19.093 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:31:31 -0400 (0:00:00.488) 0:15:19.582 ******** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:31:31 -0400 (0:00:00.299) 0:15:19.881 ******** skipping: [managed-node11] => {} TASK [Collect info about 
the volumes.] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:31:32 -0400 (0:00:00.339) 0:15:20.221 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "394e1e52-cec6-40c8-bf74-b0e069960529" }, "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "size": "4G", "type": "crypt", "uuid": "2755a277-d835-466e-bf9a-e993d94a3060" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "rZPaos-Ceok-3XVM-1kaP-jB95-fFYd-ITZHBi" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:31:33 -0400 (0:00:01.497) 0:15:21.718 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002027", "end": "2025-08-19 14:31:34.664164", "rc": 0, "start": "2025-08-19 14:31:34.662137" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:31:34 -0400 (0:00:01.389) 0:15:23.108 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002258", "end": "2025-08-19 14:31:36.212586", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:31:36.210328" } STDOUT: luks-394e1e52-cec6-40c8-bf74-b0e069960529 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:31:36 -0400 (0:00:01.589) 0:15:24.698 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 19 August 2025 14:31:37 -0400 (0:00:00.618) 0:15:25.316 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 19 August 2025 14:31:37 -0400 (0:00:00.225) 0:15:25.541 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022286", "end": "2025-08-19 14:31:38.679571", "rc": 0, "start": "2025-08-19 14:31:38.657285" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 19 August 2025 14:31:39 -0400 (0:00:01.605) 0:15:27.147 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 19 August 2025 14:31:39 -0400 (0:00:00.261) 0:15:27.408 ******** included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 19 August 2025 14:31:39 -0400 (0:00:00.613) 0:15:28.021 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 19 August 2025 14:31:40 -0400 (0:00:00.363) 0:15:28.385 ******** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 19 August 2025 14:31:43 -0400 (0:00:02.860) 0:15:31.245 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 19 August 2025 14:31:43 -0400 (0:00:00.228) 0:15:31.474 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 19 August 2025 14:31:43 -0400 (0:00:00.284) 0:15:31.758 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 19 August 2025 14:31:43 -0400 (0:00:00.242) 0:15:32.001 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 19 August 2025 14:31:44 -0400 (0:00:00.261) 0:15:32.263 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 19 August 2025 14:31:44 -0400 (0:00:00.304) 0:15:32.568 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Tuesday 19 August 2025 14:31:44 -0400 (0:00:00.563) 0:15:33.131 ******** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Tuesday 19 August 2025 14:31:45 -0400 (0:00:00.271) 0:15:33.403 ******** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.43.133 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Tuesday 19 August 2025 14:31:46 -0400 (0:00:01.624) 0:15:35.027 ******** skipping: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Tuesday 19 August 2025 14:31:47 -0400 (0:00:00.293) 0:15:35.320 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 19 August 2025 14:31:47 -0400 (0:00:00.418) 0:15:35.739 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 19 August 2025 14:31:47 -0400 (0:00:00.240) 0:15:35.980 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 19 August 2025 14:31:48 -0400 (0:00:00.256) 0:15:36.236 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 19 August 2025 14:31:48 -0400 (0:00:00.312) 0:15:36.549 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 19 August 2025 14:31:48 -0400 (0:00:00.337) 0:15:36.886 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 19 August 2025 14:31:48 -0400 (0:00:00.241) 0:15:37.127 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 19 August 2025 14:31:49 -0400 (0:00:00.276) 0:15:37.403 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 19 August 2025 14:31:49 -0400 (0:00:00.261) 0:15:37.665 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 19 August 2025 14:31:49 -0400 (0:00:00.304) 0:15:37.970 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 19 August 2025 14:31:50 -0400 (0:00:00.261) 0:15:38.231 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 19 August 2025 14:31:50 -0400 (0:00:00.368) 0:15:38.600 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Tuesday 19 August 2025 14:31:50 -0400 (0:00:00.250) 0:15:38.850 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 19 August 2025 14:31:51 -0400 (0:00:00.517) 0:15:39.367 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node11 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Tuesday 19 August 2025 14:31:51 -0400 (0:00:00.524) 0:15:39.892 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Tuesday 19 August 2025 14:31:51 -0400 (0:00:00.244) 0:15:40.137 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Tuesday 19 August 2025 14:31:52 -0400 (0:00:00.236) 0:15:40.373 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Tuesday 19 August 2025 14:31:52 -0400 (0:00:00.160) 0:15:40.533 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Tuesday 19 August 2025 14:31:52 -0400 (0:00:00.270) 0:15:40.804 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Tuesday 19 August 2025 14:31:52 -0400 (0:00:00.284) 0:15:41.089 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Tuesday 19 August 2025 14:31:53 -0400 (0:00:00.304) 0:15:41.394 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Tuesday 19 August 2025 14:31:53 -0400 (0:00:00.281) 0:15:41.675 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 19 August 2025 14:31:53 -0400 (0:00:00.441) 0:15:42.116 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node11 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Tuesday 19 August 2025 14:31:54 -0400 (0:00:00.457) 0:15:42.574 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Tuesday 19 August 2025 14:31:54 -0400 (0:00:00.121) 0:15:42.696 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Tuesday 19 August 2025 14:31:54 -0400 (0:00:00.167) 0:15:42.864 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Tuesday 19 August 2025 14:31:54 -0400 (0:00:00.230) 0:15:43.094 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Tuesday 19 August 2025 14:31:55 -0400 (0:00:00.204) 0:15:43.298 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 19 August 2025 14:31:55 -0400 (0:00:00.646) 0:15:43.945 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 19 August 2025 14:31:56 -0400 (0:00:00.233) 0:15:44.178 ******** skipping: [managed-node11] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 19 August 2025 14:31:56 -0400 (0:00:00.136) 0:15:44.315 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node11 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Tuesday 19 August 2025 14:31:56 -0400 (0:00:00.252) 0:15:44.567 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Tuesday 19 August 2025 14:31:56 -0400 (0:00:00.193) 0:15:44.761 ******** ok: [managed-node11] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Tuesday 19 August 2025 14:31:56 -0400 (0:00:00.232) 0:15:44.994 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Tuesday 19 August 2025 14:31:56 -0400 (0:00:00.095) 0:15:45.089 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Tuesday 19 August 2025 14:31:57 -0400 (0:00:00.134) 0:15:45.223 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Tuesday 19 August 2025 14:31:57 -0400 (0:00:00.123) 0:15:45.347 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 19 August 2025 14:31:57 -0400 (0:00:00.577) 0:15:45.924 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Tuesday 19 August 2025 14:31:57 -0400 (0:00:00.184) 0:15:46.108 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 19 August 2025 14:31:58 -0400 (0:00:00.395) 0:15:46.503 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node11 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Tuesday 19 August 2025 14:31:58 -0400 (0:00:00.431) 0:15:46.935 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Tuesday 19 August 2025 14:31:59 -0400 (0:00:00.360) 0:15:47.295 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Tuesday 19 August 2025 14:31:59 -0400 (0:00:00.264) 0:15:47.559 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Tuesday 19 August 2025 14:31:59 -0400 (0:00:00.201) 0:15:47.761 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Tuesday 19 August 2025 14:31:59 -0400 (0:00:00.238) 0:15:48.000 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Tuesday 19 August 2025 14:32:00 -0400 (0:00:00.215) 0:15:48.215 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Tuesday 19 August 2025 14:32:00 -0400 (0:00:00.224) 0:15:48.439 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Tuesday 19 August 2025 14:32:00 -0400 (0:00:00.194) 0:15:48.634 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 19 August 2025 14:32:01 -0400 (0:00:00.539) 0:15:49.174 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 19 August 2025 14:32:01 -0400 (0:00:00.278) 0:15:49.453 ******** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 19 August 2025 14:32:01 -0400 (0:00:00.398) 0:15:49.851 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 19 August 2025 
14:32:01 -0400 (0:00:00.189) 0:15:50.040 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 19 August 2025 14:32:02 -0400 (0:00:00.226) 0:15:50.267 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 19 August 2025 14:32:02 -0400 (0:00:00.371) 0:15:50.639 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 19 August 2025 14:32:02 -0400 (0:00:00.276) 0:15:50.916 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Tuesday 19 August 2025 14:32:02 -0400 (0:00:00.220) 0:15:51.136 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 19 August 2025 14:32:03 -0400 (0:00:00.280) 0:15:51.417 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:32:03 -0400 (0:00:00.374) 0:15:51.792 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:32:03 -0400 (0:00:00.191) 0:15:51.984 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:32:04 -0400 (0:00:01.109) 0:15:53.093 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:32:05 -0400 (0:00:00.210) 0:15:53.303 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:32:05 -0400 (0:00:00.190) 0:15:53.494 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:32:05 -0400 (0:00:00.244) 0:15:53.738 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:32:05 -0400 (0:00:00.247) 0:15:53.986 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:32:06 -0400 (0:00:00.292) 0:15:54.278 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:32:06 -0400 (0:00:00.241) 0:15:54.520 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:32:06 -0400 (0:00:00.165) 0:15:54.686 ******** 
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:32:06 -0400 (0:00:00.194) 0:15:54.881 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:32:06 -0400 (0:00:00.143) 0:15:55.024 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:32:07 -0400 (0:00:00.215) 0:15:55.240 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:32:07 -0400 (0:00:00.209) 0:15:55.449 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:32:07 -0400 (0:00:00.419) 0:15:55.868 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:32:08 -0400 (0:00:00.304) 0:15:56.173 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:32:08 -0400 (0:00:00.317) 0:15:56.490 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:32:08 -0400 (0:00:00.235) 0:15:56.726 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:32:09 -0400 (0:00:00.696) 0:15:57.422 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:32:09 -0400 (0:00:00.211) 0:15:57.634 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:32:09 -0400 (0:00:00.252) 0:15:57.886 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:32:09 -0400 (0:00:00.251) 0:15:58.138 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628272.6032732, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628272.6032732, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 235642, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755628272.6032732, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:32:11 -0400 (0:00:01.469) 0:15:59.607 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:32:11 -0400 (0:00:00.387) 0:15:59.995 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:32:12 -0400 (0:00:00.302) 0:16:00.297 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:32:12 -0400 (0:00:00.332) 0:16:00.630 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:32:12 -0400 (0:00:00.281) 0:16:00.911 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:32:13 -0400 (0:00:00.357) 0:16:01.268 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:32:13 -0400 (0:00:00.271) 0:16:01.540 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628272.7462738, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628272.7462738, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 235395, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755628272.7462738, "nlink": 1, "path": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:32:14 -0400 (0:00:01.374) 0:16:02.915 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:32:19 -0400 (0:00:04.583) 0:16:07.499 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.008905", "end": "2025-08-19 14:32:20.555359", "rc": 0, "start": "2025-08-19 14:32:20.546454" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 93 2e 4b c2 91 e1 89 af de f2 d4 0a 6a ac ba 47 3d 7d d5 6f MK salt: 8d fe 0d b7 68 e2 ae 90 1a 6c 61 5d f1 30 44 1d 0d b4 54 7d db e2 3a 56 b9 e6 f8 eb 4f 4d 27 59 MK iterations: 120249 UUID: 394e1e52-cec6-40c8-bf74-b0e069960529 Key Slot 0: ENABLED Iterations: 1923992 Salt: d9 72 fd 43 34 5f 24 69 25 91 05 8d 71 5a 06 00 d6 72 89 df 16 
1a 42 5f d9 92 ba e1 2b a0 af ff Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:32:20 -0400 (0:00:01.486) 0:16:08.985 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:32:21 -0400 (0:00:00.352) 0:16:09.337 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:32:21 -0400 (0:00:00.378) 0:16:09.715 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:32:21 -0400 (0:00:00.291) 0:16:10.007 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:32:22 -0400 (0:00:00.209) 0:16:10.216 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:32:22 -0400 (0:00:00.365) 0:16:10.582 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:32:22 -0400 (0:00:00.204) 0:16:10.787 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:32:23 -0400 (0:00:00.440) 0:16:11.228 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-394e1e52-cec6-40c8-bf74-b0e069960529 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:32:23 -0400 (0:00:00.384) 0:16:11.612 ******** ok: 
[managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:32:23 -0400 (0:00:00.216) 0:16:11.829 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:32:23 -0400 (0:00:00.309) 0:16:12.138 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:32:24 -0400 (0:00:00.425) 0:16:12.564 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:32:24 -0400 (0:00:00.345) 0:16:12.909 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:32:24 -0400 (0:00:00.215) 0:16:13.125 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:32:25 -0400 (0:00:00.250) 0:16:13.375 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:32:25 -0400 (0:00:00.247) 0:16:13.622 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:32:25 -0400 (0:00:00.145) 0:16:13.768 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:32:25 -0400 (0:00:00.249) 0:16:14.018 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] 
**************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:32:26 -0400 (0:00:00.264) 0:16:14.282 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:32:26 -0400 (0:00:00.267) 0:16:14.550 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:32:26 -0400 (0:00:00.227) 0:16:14.777 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:32:26 -0400 (0:00:00.341) 0:16:15.119 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:32:27 -0400 (0:00:00.293) 0:16:15.412 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:32:27 -0400 (0:00:00.156) 0:16:15.569 ******** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:32:30 -0400 (0:00:02.966) 0:16:18.536 ******** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:32:31 -0400 (0:00:01.464) 0:16:20.000 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:32:32 -0400 (0:00:00.264) 0:16:20.264 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:32:32 -0400 (0:00:00.224) 0:16:20.489 ******** ok: [managed-node11] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:32:33 -0400 (0:00:01.304) 0:16:21.793 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:32:34 -0400 (0:00:00.757) 0:16:22.551 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:32:34 -0400 (0:00:00.286) 0:16:22.838 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:32:34 -0400 (0:00:00.241) 0:16:23.079 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:32:35 -0400 (0:00:00.315) 0:16:23.394 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:32:35 -0400 (0:00:00.430) 0:16:23.825 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:32:36 -0400 (0:00:00.367) 0:16:24.193 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:32:36 -0400 (0:00:00.338) 0:16:24.532 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:32:36 -0400 (0:00:00.299) 0:16:24.832 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] 
******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:32:37 -0400 (0:00:00.351) 0:16:25.184 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:32:37 -0400 (0:00:00.344) 0:16:25.528 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:32:37 -0400 (0:00:00.285) 0:16:25.814 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:32:37 -0400 (0:00:00.161) 0:16:25.976 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:32:38 -0400 (0:00:00.262) 0:16:26.238 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:32:38 -0400 (0:00:00.216) 0:16:26.454 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:32:38 -0400 (0:00:00.256) 0:16:26.711 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:32:38 -0400 (0:00:00.240) 0:16:26.951 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:32:39 -0400 (0:00:00.352) 0:16:27.304 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:32:39 -0400 (0:00:00.238) 0:16:27.542 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] 
******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:32:39 -0400 (0:00:00.298) 0:16:27.841 ******** ok: [managed-node11] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:32:40 -0400 (0:00:00.301) 0:16:28.142 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:32:40 -0400 (0:00:00.242) 0:16:28.385 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:32:40 -0400 (0:00:00.373) 0:16:28.759 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.021049", "end": "2025-08-19 14:32:42.000617", "rc": 0, "start": "2025-08-19 14:32:41.979568" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:32:42 -0400 (0:00:01.712) 0:16:30.471 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:32:42 -0400 (0:00:00.264) 0:16:30.735 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:32:42 -0400 (0:00:00.328) 0:16:31.064 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:32:43 -0400 (0:00:00.288) 0:16:31.352 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:32:43 
-0400 (0:00:00.302) 0:16:31.655 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:32:43 -0400 (0:00:00.188) 0:16:31.843 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:32:43 -0400 (0:00:00.206) 0:16:32.050 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:32:44 -0400 (0:00:00.196) 0:16:32.246 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:32:44 -0400 (0:00:00.284) 0:16:32.531 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:399 Tuesday 19 August 2025 14:32:44 -0400 (0:00:00.220) 0:16:32.752 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:32:45 -0400 (0:00:00.596) 0:16:33.349 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:32:45 -0400 (0:00:00.417) 0:16:33.766 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:32:45 -0400 (0:00:00.344) 0:16:34.110 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 
's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:32:46 -0400 (0:00:00.479) 0:16:34.590 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:32:46 -0400 (0:00:00.316) 0:16:34.907 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:32:47 -0400 (0:00:00.359) 0:16:35.267 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:32:47 -0400 (0:00:00.182) 0:16:35.449 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:32:47 -0400 (0:00:00.215) 0:16:35.665 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:32:48 -0400 (0:00:00.493) 0:16:36.159 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:32:52 -0400 (0:00:04.507) 0:16:40.666 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : 
Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:32:52 -0400 (0:00:00.220) 0:16:40.887 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:32:52 -0400 (0:00:00.214) 0:16:41.102 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:32:58 -0400 (0:00:05.382) 0:16:46.484 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:32:58 -0400 (0:00:00.372) 0:16:46.856 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:32:58 -0400 (0:00:00.211) 0:16:47.068 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:32:59 -0400 (0:00:00.280) 0:16:47.349 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:32:59 -0400 (0:00:00.695) 0:16:48.044 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:33:04 -0400 (0:00:04.854) 0:16:52.898 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { 
"name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service": { "name": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service": { "name": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { 
"name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:33:07 -0400 (0:00:02.886) 0:16:55.785 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:33:08 -0400 (0:00:00.441) 0:16:56.226 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2dac72a752\x2d0f7b\x2d4918\x2da9fe\x2db8223790efc2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "name": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "tmp.mount cryptsetup-pre.target -.mount dev-sda1.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-ac72a752-0f7b-4918-a9fe-b8223790efc2", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-ac72a752-0f7b-4918-a9fe-b8223790efc2 /dev/sda1 /tmp/storage_testkbs2wr22lukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-ac72a752-0f7b-4918-a9fe-b8223790efc2 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": 
"no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice -.mount", "RequiresMountsFor": "/tmp/storage_testkbs2wr22lukskey", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:31:24 EDT", "StateChangeTimestampMonotonic": "2601147160", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d0f7b\x2d4918\x2da9fe\x2db8223790efc2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "name": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", 
"ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:33:11 -0400 (0:00:03.303) 0:16:59.530 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": 
null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:33:17 -0400 (0:00:05.641) 0:17:05.172 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:33:17 -0400 (0:00:00.233) 0:17:05.405 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628282.175327, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2136b168aed65fb5d3269ecd9035b51807a5a1b4", "ctime": 1755628282.172327, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755628282.172327, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:33:18 -0400 (0:00:01.393) 0:17:06.799 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:33:18 -0400 (0:00:00.203) 0:17:07.002 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2dac72a752\x2d0f7b\x2d4918\x2da9fe\x2db8223790efc2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "name": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", 
"CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", 
"NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dac72a752\\x2d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d0f7b\x2d4918\x2da9fe\x2db8223790efc2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "name": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time 
cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d0f7b\\x2d4918\\x2da9fe\\x2db8223790efc2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", 
"ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:33:22 -0400 (0:00:03.605) 0:17:10.608 ******** ok: [managed-node11] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:33:22 -0400 (0:00:00.196) 0:17:10.804 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:33:22 -0400 (0:00:00.215) 0:17:11.019 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:33:23 -0400 (0:00:00.296) 0:17:11.316 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 
August 2025 14:33:23 -0400 (0:00:00.277) 0:17:11.594 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:33:25 -0400 (0:00:01.823) 0:17:13.417 ******** ok: [managed-node11] => (item={'src': '/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:33:27 -0400 (0:00:02.078) 0:17:15.496 ******** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:33:27 -0400 (0:00:00.348) 0:17:15.844 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:33:29 -0400 (0:00:01.939) 0:17:17.784 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628296.2114062, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e7aae20622194b7b543ec40aabdd2b0d35f8d446", "ctime": 1755628288.542363, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 511705223, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1755628288.542363, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2972772328", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for 
changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:33:31 -0400 (0:00:01.573) 0:17:19.357 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:33:31 -0400 (0:00:00.186) 0:17:19.544 ******** ok: [managed-node11] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:413 Tuesday 19 August 2025 14:33:33 -0400 (0:00:01.981) 0:17:21.525 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:420 Tuesday 19 August 2025 14:33:33 -0400 (0:00:00.490) 0:17:22.016 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:33:34 -0400 (0:00:00.405) 0:17:22.421 ******** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:33:35 -0400 (0:00:01.057) 0:17:23.478 ******** skipping: 
[managed-node11] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:33:35 -0400 (0:00:00.283) 0:17:23.762 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "394e1e52-cec6-40c8-bf74-b0e069960529" }, "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "size": "4G", "type": "crypt", "uuid": "2755a277-d835-466e-bf9a-e993d94a3060" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "rZPaos-Ceok-3XVM-1kaP-jB95-fFYd-ITZHBi" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:33:37 -0400 (0:00:01.660) 0:17:25.423 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002476", "end": "2025-08-19 14:33:38.491884", "rc": 0, "start": "2025-08-19 14:33:38.489408" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:33:38 -0400 (0:00:01.483) 0:17:26.906 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002236", "end": "2025-08-19 14:33:39.949245", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:33:39.947009" } STDOUT: luks-394e1e52-cec6-40c8-bf74-b0e069960529 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:33:40 -0400 (0:00:01.424) 0:17:28.331 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 19 August 2025 14:33:40 -0400 (0:00:00.321) 0:17:28.652 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 19 August 2025 14:33:40 -0400 (0:00:00.199) 0:17:28.852 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.021475", "end": "2025-08-19 14:33:41.841531", "rc": 0, "start": "2025-08-19 14:33:41.820056" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 19 August 2025 14:33:42 -0400 (0:00:01.472) 0:17:30.325 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 19 August 2025 14:33:42 -0400 (0:00:00.222) 0:17:30.547 ******** included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 19 August 2025 14:33:42 -0400 (0:00:00.353) 0:17:30.901 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 19 August 2025 14:33:43 -0400 (0:00:00.246) 0:17:31.147 ******** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 19 August 2025 14:33:44 -0400 (0:00:01.236) 0:17:32.384 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 19 August 2025 14:33:44 -0400 (0:00:00.226) 0:17:32.611 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 19 August 2025 14:33:44 -0400 (0:00:00.218) 0:17:32.830 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 19 August 2025 14:33:44 -0400 (0:00:00.284) 0:17:33.114 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 19 August 2025 14:33:45 -0400 (0:00:00.187) 0:17:33.303 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 19 August 2025 14:33:45 -0400 (0:00:00.216) 0:17:33.519 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Tuesday 19 August 2025 14:33:45 -0400 (0:00:00.208) 0:17:33.728 ******** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Tuesday 19 August 2025 14:33:45 -0400 (0:00:00.398) 0:17:34.127 ******** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.43.133 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Tuesday 19 August 2025 14:33:47 -0400 (0:00:01.473) 0:17:35.600 ******** skipping: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Tuesday 19 August 2025 14:33:47 -0400 (0:00:00.306) 0:17:35.907 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 19 August 2025 14:33:47 -0400 (0:00:00.226) 0:17:36.133 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 19 August 2025 14:33:48 -0400 (0:00:00.096) 0:17:36.230 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 19 August 2025 14:33:48 -0400 (0:00:00.059) 0:17:36.290 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 19 August 2025 14:33:48 -0400 (0:00:00.129) 0:17:36.419 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 19 August 2025 14:33:48 -0400 (0:00:00.052) 0:17:36.472 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 19 August 2025 14:33:48 -0400 (0:00:00.052) 0:17:36.525 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 19 August 2025 14:33:48 -0400 (0:00:00.128) 0:17:36.653 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 19 August 2025 14:33:48 -0400 (0:00:00.239) 0:17:36.893 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 19 August 2025 14:33:49 -0400 (0:00:00.267) 0:17:37.161 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 19 August 2025 14:33:49 -0400 (0:00:00.219) 0:17:37.380 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 19 August 2025 14:33:49 -0400 (0:00:00.307) 0:17:37.687 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Tuesday 19 August 2025 14:33:49 -0400 (0:00:00.232) 0:17:37.920 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 19 August 2025 14:33:50 -0400 (0:00:00.526) 0:17:38.446 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node11 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Tuesday 19 August 2025 14:33:51 -0400 (0:00:00.837) 0:17:39.284 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Tuesday 19 August 2025 14:33:51 -0400 (0:00:00.314) 0:17:39.599 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Tuesday 19 August 2025 14:33:51 -0400 (0:00:00.276) 0:17:39.875 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Tuesday 19 August 2025 14:33:51 -0400 (0:00:00.222) 0:17:40.098 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Tuesday 19 August 2025 14:33:52 -0400 (0:00:00.204) 0:17:40.302 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Tuesday 19 August 2025 14:33:52 -0400 (0:00:00.234) 0:17:40.536 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Tuesday 19 August 2025 14:33:52 -0400 (0:00:00.182) 0:17:40.718 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Tuesday 19 August 2025 14:33:52 -0400 (0:00:00.227) 0:17:40.946 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 19 August 2025 14:33:53 -0400 (0:00:00.363) 0:17:41.310 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node11 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Tuesday 19 August 2025 14:33:53 -0400 (0:00:00.287) 0:17:41.597 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Tuesday 19 August 2025 14:33:53 -0400 (0:00:00.143) 0:17:41.740 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Tuesday 19 August 2025 14:33:53 -0400 (0:00:00.107) 0:17:41.848 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Tuesday 19 August 2025 14:33:53 -0400 (0:00:00.175) 0:17:42.023 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Tuesday 19 August 2025 14:33:54 -0400 (0:00:00.148) 0:17:42.172 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 19 August 2025 14:33:54 -0400 (0:00:00.512) 0:17:42.684 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 19 August 2025 14:33:54 -0400 (0:00:00.314) 0:17:42.998 ******** skipping: [managed-node11] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 19 August 2025 14:33:55 -0400 (0:00:00.189) 0:17:43.188 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node11 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Tuesday 19 August 2025 14:33:55 -0400 (0:00:00.513) 0:17:43.701 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Tuesday 19 August 2025 14:33:55 -0400 (0:00:00.258) 0:17:43.960 ******** ok: [managed-node11] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Tuesday 19 August 2025 14:33:55 -0400 (0:00:00.156) 0:17:44.117 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Tuesday 19 August 2025 14:33:56 -0400 (0:00:00.103) 0:17:44.220 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Tuesday 19 August 2025 14:33:56 -0400 (0:00:00.169) 0:17:44.390 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Tuesday 19 August 2025 14:33:56 -0400 (0:00:00.200) 0:17:44.590 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 19 August 2025 14:33:56 -0400 (0:00:00.165) 0:17:44.755 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Tuesday 19 August 2025 14:33:56 -0400 (0:00:00.103) 0:17:44.859 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 19 August 2025 14:33:57 -0400 (0:00:00.472) 0:17:45.332 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node11 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Tuesday 19 August 2025 14:33:57 -0400 (0:00:00.283) 0:17:45.615 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Tuesday 19 August 2025 14:33:57 -0400 (0:00:00.248) 0:17:45.864 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Tuesday 19 August 2025 14:33:57 -0400 (0:00:00.192) 0:17:46.056 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Tuesday 19 August 2025 14:33:58 -0400 (0:00:00.275) 0:17:46.332 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Tuesday 19 August 2025 14:33:58 -0400 (0:00:00.209) 0:17:46.542 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Tuesday 19 August 2025 14:33:58 -0400 (0:00:00.288) 0:17:46.830 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Tuesday 19 August 2025 14:33:58 -0400 (0:00:00.283) 0:17:47.114 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Tuesday 19 August 2025 14:33:59 -0400 (0:00:00.250) 0:17:47.365 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 19 August 2025 14:33:59 -0400 (0:00:00.550) 0:17:47.915 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 19 August 2025 14:34:00 -0400 (0:00:00.263) 0:17:48.179 ******** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 19 August 2025 14:34:00 -0400 (0:00:00.138) 0:17:48.318 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 19 August 2025 
14:34:00 -0400 (0:00:00.198) 0:17:48.516 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 19 August 2025 14:34:00 -0400 (0:00:00.167) 0:17:48.683 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 19 August 2025 14:34:00 -0400 (0:00:00.287) 0:17:48.971 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 19 August 2025 14:34:01 -0400 (0:00:00.605) 0:17:49.576 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Tuesday 19 August 2025 14:34:01 -0400 (0:00:00.274) 0:17:49.851 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 19 August 2025 14:34:01 -0400 (0:00:00.141) 0:17:49.992 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:34:02 -0400 (0:00:00.514) 0:17:50.506 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:34:02 -0400 (0:00:00.248) 0:17:50.755 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:34:03 -0400 (0:00:01.195) 0:17:51.950 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:34:04 -0400 (0:00:00.281) 0:17:52.232 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:34:04 -0400 (0:00:00.309) 0:17:52.541 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:34:04 -0400 (0:00:00.440) 0:17:52.981 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:34:05 -0400 (0:00:00.214) 0:17:53.195 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:34:05 -0400 (0:00:00.236) 0:17:53.432 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:34:05 -0400 (0:00:00.210) 0:17:53.643 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:34:05 -0400 (0:00:00.308) 0:17:53.952 ******** 
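[Editor's note] The mount checks reported above reduce to asserting that the decrypted mapper device is what is actually mounted on the expected mount point. A minimal, hypothetical sketch of that kind of assertion follows; the device path and mount point are the values printed in this run, the task name and variable handling are illustrative, and the exact expression used by test-verify-volume-mount.yml may differ.

- name: Verify the current mount state by device (illustrative sketch)
  vars:
    # Values observed in this run; the real test derives them from the pool/volume facts.
    storage_test_device_path: /dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529
    storage_test_mount_expected_mount_point: /opt/test1
  assert:
    that:
      # ansible_mounts is the standard gathered fact: a list of dicts with 'device' and 'mount' keys.
      - ansible_mounts | selectattr('device', 'equalto', storage_test_device_path)
                       | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point)
                       | list | length == 1
    msg: "Expected {{ storage_test_device_path }} to be mounted on {{ storage_test_mount_expected_mount_point }}"

When the assertion holds, the runner prints "All assertions passed", as seen in the records above; a mismatch would fail the task and the play.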
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:34:06 -0400 (0:00:00.206) 0:17:54.158 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:34:06 -0400 (0:00:00.227) 0:17:54.386 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:34:06 -0400 (0:00:00.176) 0:17:54.562 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:34:06 -0400 (0:00:00.276) 0:17:54.839 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:34:07 -0400 (0:00:00.372) 0:17:55.211 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:34:07 -0400 (0:00:00.183) 0:17:55.395 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:34:07 -0400 (0:00:00.206) 0:17:55.601 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:34:07 -0400 (0:00:00.236) 0:17:55.838 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:34:07 -0400 (0:00:00.253) 0:17:56.091 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:34:08 -0400 (0:00:00.216) 0:17:56.307 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:34:08 -0400 (0:00:00.347) 0:17:56.655 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:34:08 -0400 (0:00:00.209) 0:17:56.864 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628340.550655, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628272.6032732, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 235642, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755628272.6032732, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:34:10 -0400 (0:00:01.330) 0:17:58.194 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:34:10 -0400 (0:00:00.208) 0:17:58.402 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:34:10 -0400 (0:00:00.207) 0:17:58.610 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:34:10 -0400 (0:00:00.193) 0:17:58.804 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:34:10 -0400 (0:00:00.195) 0:17:58.999 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:34:11 -0400 (0:00:00.225) 0:17:59.224 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:34:11 -0400 (0:00:00.294) 0:17:59.519 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628396.6589699, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628272.7462738, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 235395, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755628272.7462738, "nlink": 1, "path": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:34:12 -0400 (0:00:01.487) 0:18:01.006 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:34:17 -0400 (0:00:04.332) 0:18:05.339 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009260", "end": "2025-08-19 14:34:18.272839", "rc": 0, "start": "2025-08-19 14:34:18.263579" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 93 2e 4b c2 91 e1 89 af de f2 d4 0a 6a ac ba 47 3d 7d d5 6f MK salt: 8d fe 0d b7 68 e2 ae 90 1a 6c 61 5d f1 30 44 1d 0d b4 54 7d db e2 3a 56 b9 e6 f8 eb 4f 4d 27 59 MK iterations: 120249 UUID: 394e1e52-cec6-40c8-bf74-b0e069960529 Key Slot 0: ENABLED Iterations: 1923992 Salt: d9 72 fd 43 34 5f 24 69 25 91 05 8d 71 5a 06 00 d6 72 89 df 16 
1a 42 5f d9 92 ba e1 2b a0 af ff Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:34:18 -0400 (0:00:01.378) 0:18:06.717 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:34:18 -0400 (0:00:00.282) 0:18:06.999 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:34:19 -0400 (0:00:00.256) 0:18:07.255 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:34:19 -0400 (0:00:00.287) 0:18:07.543 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:34:19 -0400 (0:00:00.198) 0:18:07.741 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:34:19 -0400 (0:00:00.222) 0:18:07.964 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:34:19 -0400 (0:00:00.172) 0:18:08.137 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:34:20 -0400 (0:00:00.245) 0:18:08.382 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-394e1e52-cec6-40c8-bf74-b0e069960529 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:34:20 
-0400 (0:00:00.091) 0:18:08.473 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:34:20 -0400 (0:00:00.176) 0:18:08.650 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:34:20 -0400 (0:00:00.245) 0:18:08.896 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:34:20 -0400 (0:00:00.221) 0:18:09.117 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:34:21 -0400 (0:00:00.172) 0:18:09.290 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:34:21 -0400 (0:00:00.186) 0:18:09.476 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:34:21 -0400 (0:00:00.127) 0:18:09.603 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:34:21 -0400 (0:00:00.182) 0:18:09.786 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:34:21 -0400 (0:00:00.162) 0:18:09.949 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:34:21 -0400 (0:00:00.167) 0:18:10.117 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the 
chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:34:22 -0400 (0:00:00.206) 0:18:10.323 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:34:22 -0400 (0:00:00.182) 0:18:10.505 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:34:22 -0400 (0:00:00.582) 0:18:11.088 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:34:23 -0400 (0:00:00.178) 0:18:11.266 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:34:23 -0400 (0:00:00.083) 0:18:11.349 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:34:23 -0400 (0:00:00.154) 0:18:11.504 ******** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:34:24 -0400 (0:00:00.927) 0:18:12.432 ******** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:34:25 -0400 (0:00:01.026) 0:18:13.459 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:34:25 -0400 (0:00:00.256) 0:18:13.715 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:34:25 -0400 (0:00:00.148) 0:18:13.864 ******** ok: [managed-node11] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:34:26 -0400 (0:00:00.890) 0:18:14.754 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:34:26 -0400 (0:00:00.164) 0:18:14.919 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:34:26 -0400 (0:00:00.153) 0:18:15.072 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:34:27 -0400 (0:00:00.242) 0:18:15.314 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:34:27 -0400 (0:00:00.195) 0:18:15.509 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:34:27 -0400 (0:00:00.294) 0:18:15.804 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:34:27 -0400 (0:00:00.233) 0:18:16.037 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:34:28 -0400 (0:00:00.208) 0:18:16.246 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:34:28 -0400 (0:00:00.229) 0:18:16.475 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] 
******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:34:28 -0400 (0:00:00.287) 0:18:16.763 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:34:28 -0400 (0:00:00.223) 0:18:16.987 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:34:29 -0400 (0:00:00.168) 0:18:17.155 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:34:29 -0400 (0:00:00.117) 0:18:17.273 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:34:29 -0400 (0:00:00.134) 0:18:17.408 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:34:29 -0400 (0:00:00.116) 0:18:17.524 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:34:29 -0400 (0:00:00.178) 0:18:17.702 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:34:29 -0400 (0:00:00.210) 0:18:17.912 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:34:29 -0400 (0:00:00.174) 0:18:18.087 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:34:30 -0400 (0:00:00.179) 0:18:18.266 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] 
******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:34:30 -0400 (0:00:00.176) 0:18:18.442 ******** ok: [managed-node11] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:34:30 -0400 (0:00:00.236) 0:18:18.678 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:34:30 -0400 (0:00:00.256) 0:18:18.934 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:34:31 -0400 (0:00:00.273) 0:18:19.208 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.029669", "end": "2025-08-19 14:34:32.070490", "rc": 0, "start": "2025-08-19 14:34:32.040821" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:34:32 -0400 (0:00:01.169) 0:18:20.377 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:34:32 -0400 (0:00:00.197) 0:18:20.575 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:34:32 -0400 (0:00:00.176) 0:18:20.751 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:34:32 -0400 (0:00:00.203) 0:18:20.954 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:34:33 
-0400 (0:00:00.208) 0:18:21.163 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:34:33 -0400 (0:00:00.182) 0:18:21.346 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:34:33 -0400 (0:00:00.226) 0:18:21.572 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:34:33 -0400 (0:00:00.216) 0:18:21.789 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:34:33 -0400 (0:00:00.197) 0:18:21.987 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 19 August 2025 14:34:34 -0400 (0:00:00.191) 0:18:22.178 ******** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:426 Tuesday 19 August 2025 14:34:35 -0400 (0:00:01.072) 0:18:23.251 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:34:35 -0400 (0:00:00.322) 0:18:23.573 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:34:35 -0400 (0:00:00.273) 0:18:23.847 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:34:35 -0400 (0:00:00.243) 0:18:24.090 ******** included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:34:36 -0400 (0:00:00.149) 0:18:24.240 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:34:36 -0400 (0:00:00.180) 0:18:24.421 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:34:36 -0400 (0:00:00.414) 0:18:24.835 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:34:37 -0400 (0:00:00.322) 0:18:25.158 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:34:37 -0400 (0:00:00.220) 0:18:25.379 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 
August 2025 14:34:37 -0400 (0:00:00.195) 0:18:25.574 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:34:37 -0400 (0:00:00.183) 0:18:25.758 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:34:37 -0400 (0:00:00.342) 0:18:26.101 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:34:42 -0400 (0:00:04.142) 0:18:30.243 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:34:42 -0400 (0:00:00.212) 0:18:30.456 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:34:42 -0400 (0:00:00.214) 0:18:30.671 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:34:47 -0400 (0:00:05.256) 0:18:35.927 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:34:48 -0400 (0:00:00.353) 0:18:36.281 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:34:48 -0400 (0:00:00.178) 0:18:36.459 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:34:48 -0400 (0:00:00.094) 0:18:36.553 ******** TASK 
[fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:34:48 -0400 (0:00:00.124) 0:18:36.678 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:34:52 -0400 (0:00:04.087) 0:18:40.766 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": 
"dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service": { "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service": { "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:34:55 -0400 (0:00:02.548) 0:18:43.314 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:34:55 -0400 (0:00:00.391) 0:18:43.705 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d394e1e52\x2dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-394e1e52-cec6-40c8-bf74-b0e069960529", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-394e1e52-cec6-40c8-bf74-b0e069960529 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-394e1e52-cec6-40c8-bf74-b0e069960529 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:33:11 EDT", "StateChangeTimestampMonotonic": "2708111602", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:34:58 -0400 (0:00:02.987) 0:18:46.693 ******** fatal: [managed-node11]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-394e1e52-cec6-40c8-bf74-b0e069960529' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:35:03 -0400 (0:00:05.266) 0:18:51.960 ******** fatal: [managed-node11]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-394e1e52-cec6-40c8-bf74-b0e069960529' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 
'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:35:04 -0400 (0:00:00.252) 0:18:52.213 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d394e1e52\x2dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:33:11 EDT", "StateChangeTimestampMonotonic": "2708111602", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", 
"TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", 
"LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:35:07 -0400 (0:00:03.636) 0:18:55.849 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:35:08 -0400 (0:00:00.365) 0:18:56.215 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 19 August 2025 14:35:08 -0400 (0:00:00.337) 0:18:56.552 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 19 August 2025 14:35:08 -0400 (0:00:00.187) 0:18:56.739 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628474.8754086, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755628474.8754086, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1755628474.8754086, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "830404678", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 19 August 2025 14:35:10 -0400 (0:00:01.454) 0:18:58.194 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:449 Tuesday 19 August 2025 14:35:10 -0400 (0:00:00.311) 0:18:58.505 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:35:11 -0400 (0:00:00.654) 0:18:59.160 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:35:11 -0400 (0:00:00.301) 0:18:59.474 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version 
specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:35:11 -0400 (0:00:00.308) 0:18:59.782 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:35:12 -0400 (0:00:00.949) 0:19:00.731 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:35:12 -0400 (0:00:00.222) 0:19:00.954 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:35:12 -0400 (0:00:00.175) 0:19:01.130 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:35:13 -0400 (0:00:00.276) 0:19:01.406 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:35:13 -0400 (0:00:00.177) 0:19:01.584 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for 
managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:35:13 -0400 (0:00:00.508) 0:19:02.092 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:35:18 -0400 (0:00:04.714) 0:19:06.807 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:35:18 -0400 (0:00:00.175) 0:19:06.983 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:35:19 -0400 (0:00:00.186) 0:19:07.169 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:35:24 -0400 (0:00:05.608) 0:19:12.777 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:35:24 -0400 (0:00:00.276) 0:19:13.054 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:35:25 -0400 (0:00:00.169) 0:19:13.223 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:35:25 -0400 (0:00:00.175) 0:19:13.398 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:35:25 -0400 (0:00:00.218) 0:19:13.616 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:35:30 -0400 (0:00:04.611) 0:19:18.228 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { 
"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": 
"nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service": { "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service": { "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:35:32 -0400 (0:00:02.711) 0:19:20.939 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:35:33 -0400 (0:00:00.320) 0:19:21.259 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d394e1e52\x2dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-394e1e52-cec6-40c8-bf74-b0e069960529", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-394e1e52-cec6-40c8-bf74-b0e069960529 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-394e1e52-cec6-40c8-bf74-b0e069960529 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:33:11 EDT", "StateChangeTimestampMonotonic": "2708111602", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", 
"CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": 
"yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:35:35 -0400 (0:00:02.877) 0:19:24.136 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-394e1e52-cec6-40c8-bf74-b0e069960529", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:35:41 -0400 (0:00:05.427) 0:19:29.564 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:35:41 -0400 (0:00:00.210) 0:19:29.774 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628282.175327, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2136b168aed65fb5d3269ecd9035b51807a5a1b4", "ctime": 1755628282.172327, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755628282.172327, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:35:42 -0400 (0:00:00.953) 0:19:30.727 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:35:43 -0400 (0:00:01.396) 0:19:32.124 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d394e1e52\x2dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": 
"none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:33:11 EDT", "StateChangeTimestampMonotonic": "2708111602", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", 
"TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:35:46 -0400 (0:00:02.759) 0:19:34.883 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-394e1e52-cec6-40c8-bf74-b0e069960529", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:35:46 -0400 (0:00:00.182) 0:19:35.066 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:35:47 -0400 (0:00:00.150) 0:19:35.216 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:35:47 -0400 (0:00:00.129) 0:19:35.346 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-394e1e52-cec6-40c8-bf74-b0e069960529" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:35:48 -0400 (0:00:01.236) 0:19:36.583 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:35:50 -0400 (0:00:01.653) 0:19:38.236 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': 
None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:35:51 -0400 (0:00:01.431) 0:19:39.667 ******** skipping: [managed-node11] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:35:51 -0400 (0:00:00.382) 0:19:40.049 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:35:53 -0400 (0:00:01.737) 0:19:41.786 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628296.2114062, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e7aae20622194b7b543ec40aabdd2b0d35f8d446", "ctime": 1755628288.542363, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 511705223, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1755628288.542363, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2972772328", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:35:54 -0400 (0:00:01.164) 0:19:42.951 ******** changed: [managed-node11] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-394e1e52-cec6-40c8-bf74-b0e069960529', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-394e1e52-cec6-40c8-bf74-b0e069960529", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:35:56 -0400 (0:00:01.333) 0:19:44.284 ******** ok: [managed-node11] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:465 Tuesday 19 August 2025 14:35:57 -0400 (0:00:01.386) 0:19:45.671 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:35:57 -0400 (0:00:00.259) 0:19:45.931 ******** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:35:57 -0400 (0:00:00.148) 0:19:46.080 ******** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:35:58 -0400 (0:00:00.093) 0:19:46.173 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "a3e03940-cf58-4ea1-bb23-fa26172b93c3" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "rZPaos-Ceok-3XVM-1kaP-jB95-fFYd-ITZHBi" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:35:59 -0400 (0:00:01.413) 0:19:47.587 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002170", "end": "2025-08-19 14:36:00.644269", "rc": 0, "start": "2025-08-19 14:36:00.642099" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:36:00 -0400 (0:00:01.489) 0:19:49.077 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002416", "end": "2025-08-19 14:36:02.156369", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:36:02.153953" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:36:02 -0400 (0:00:01.568) 0:19:50.645 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 19 August 2025 14:36:02 -0400 (0:00:00.299) 0:19:50.944 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 19 August 2025 14:36:03 -0400 (0:00:00.213) 0:19:51.158 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.024827", "end": "2025-08-19 14:36:04.172101", "rc": 0, "start": "2025-08-19 14:36:04.147274" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 19 August 2025 14:36:04 -0400 (0:00:01.287) 0:19:52.445 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 19 August 2025 14:36:04 -0400 (0:00:00.266) 0:19:52.712 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11
included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 19 August 2025 14:36:04 -0400 (0:00:00.414) 0:19:53.126 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 19 August 2025 14:36:05 -0400 (0:00:00.258) 0:19:53.385 ******** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 19 August 2025 14:36:06 -0400 (0:00:01.553) 0:19:54.939 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 19 August 2025 14:36:07 -0400 (0:00:00.241) 0:19:55.180 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 19 August 2025 14:36:07 -0400 (0:00:00.210) 0:19:55.390 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 19 August 2025 14:36:07 -0400 (0:00:00.169) 0:19:55.560 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 19 August 2025 14:36:07 -0400 (0:00:00.208) 0:19:55.768 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 19 August 2025 14:36:07 -0400 (0:00:00.195) 0:19:55.963 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Tuesday 19 August 2025 14:36:07 -0400 (0:00:00.144) 0:19:56.108 ******** ok: [managed-node11] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Tuesday 19 August 2025 14:36:08 -0400 (0:00:00.224) 0:19:56.332 ******** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.43.133 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Tuesday 19 August 2025 14:36:09 -0400 (0:00:01.605) 0:19:57.938 ******** skipping: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Tuesday 19 August 2025 14:36:10 -0400 (0:00:00.269) 0:19:58.208 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 19 August 2025 14:36:10 -0400 (0:00:00.832) 0:19:59.040 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 19 August 2025 14:36:11 -0400 (0:00:00.262) 0:19:59.303 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 19 August 2025 14:36:11 -0400 (0:00:00.161) 0:19:59.464 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 19 August 2025 14:36:11 -0400 (0:00:00.174) 0:19:59.639 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 19 August 2025 14:36:11 -0400 (0:00:00.225) 0:19:59.865 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 19 August 2025 14:36:11 -0400 (0:00:00.131) 0:19:59.996 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 19 August 2025 14:36:12 -0400 (0:00:00.221) 0:20:00.217 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 19 August 2025 14:36:12 -0400 (0:00:00.255) 0:20:00.473 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 19 August 2025 14:36:12 -0400 (0:00:00.255) 0:20:00.728 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 19 August 2025 14:36:12 -0400 (0:00:00.222) 0:20:00.951 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 19 August 2025 14:36:13 -0400 (0:00:00.285) 0:20:01.237 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Tuesday 19 August 2025 14:36:13 -0400 (0:00:00.176) 0:20:01.413 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 19 August 2025 14:36:13 -0400 (0:00:00.307) 0:20:01.720 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node11 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Tuesday 19 August 2025 14:36:13 -0400 (0:00:00.270) 0:20:01.990 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Tuesday 19 August 2025 14:36:13 -0400 (0:00:00.136) 0:20:02.127 ******** skipping: [managed-node11] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Tuesday 19 August 2025 14:36:14 -0400 (0:00:00.118) 0:20:02.246 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Tuesday 19 August 2025 14:36:14 -0400 (0:00:00.317) 0:20:02.564 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Tuesday 19 August 2025 14:36:14 -0400 (0:00:00.174) 0:20:02.738 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Tuesday 19 August 2025 14:36:14 -0400 (0:00:00.190) 0:20:02.929 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Tuesday 19 August 2025 14:36:14 -0400 (0:00:00.172) 0:20:03.102 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Tuesday 19 August 2025 14:36:15 -0400 (0:00:00.186) 0:20:03.288 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 19 August 2025 14:36:15 -0400 (0:00:00.333) 0:20:03.621 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node11 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Tuesday 19 August 2025 14:36:16 -0400 (0:00:00.523) 0:20:04.145 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Tuesday 19 August 2025 14:36:16 -0400 (0:00:00.246) 0:20:04.392 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in 
thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Tuesday 19 August 2025 14:36:16 -0400 (0:00:00.196) 0:20:04.588 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Tuesday 19 August 2025 14:36:16 -0400 (0:00:00.222) 0:20:04.810 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Tuesday 19 August 2025 14:36:16 -0400 (0:00:00.300) 0:20:05.111 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 19 August 2025 14:36:17 -0400 (0:00:00.368) 0:20:05.479 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 19 August 2025 14:36:17 -0400 (0:00:00.320) 0:20:05.800 ******** skipping: [managed-node11] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 19 August 2025 14:36:17 -0400 (0:00:00.267) 0:20:06.067 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node11 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Tuesday 19 August 2025 14:36:18 -0400 (0:00:00.371) 0:20:06.439 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Tuesday 19 August 2025 14:36:18 -0400 (0:00:00.263) 0:20:06.703 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Tuesday 19 August 2025 14:36:18 -0400 (0:00:00.209) 0:20:06.912 ******** skipping: 
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Tuesday 19 August 2025 14:36:18 -0400 (0:00:00.225) 0:20:07.137 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Tuesday 19 August 2025 14:36:19 -0400 (0:00:00.168) 0:20:07.306 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Tuesday 19 August 2025 14:36:19 -0400 (0:00:00.292) 0:20:07.599 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 19 August 2025 14:36:19 -0400 (0:00:00.269) 0:20:07.868 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Tuesday 19 August 2025 14:36:19 -0400 (0:00:00.254) 0:20:08.122 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 19 August 2025 14:36:20 -0400 (0:00:00.563) 0:20:08.685 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node11 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Tuesday 19 August 2025 14:36:21 -0400 (0:00:00.894) 0:20:09.580 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Tuesday 19 August 2025 14:36:21 -0400 (0:00:00.236) 0:20:09.816 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Tuesday 19 August 2025 14:36:21 -0400 (0:00:00.199) 0:20:10.016 ******** skipping: [managed-node11] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Tuesday 19 August 2025 14:36:22 -0400 (0:00:00.317) 0:20:10.334 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Tuesday 19 August 2025 14:36:22 -0400 (0:00:00.317) 0:20:10.652 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Tuesday 19 August 2025 14:36:22 -0400 (0:00:00.184) 0:20:10.836 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Tuesday 19 August 2025 14:36:22 -0400 (0:00:00.299) 0:20:11.136 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Tuesday 19 August 2025 14:36:23 -0400 (0:00:00.225) 0:20:11.361 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 19 August 2025 14:36:23 -0400 (0:00:00.488) 0:20:11.849 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 19 August 2025 14:36:23 -0400 (0:00:00.143) 0:20:11.992 ******** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 19 August 2025 14:36:24 -0400 (0:00:00.166) 0:20:12.159 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 19 August 2025 14:36:24 -0400 (0:00:00.208) 0:20:12.367 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 19 August 2025 14:36:24 -0400 (0:00:00.274) 0:20:12.642 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 19 August 2025 14:36:24 -0400 (0:00:00.198) 0:20:12.840 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 19 August 2025 14:36:24 -0400 (0:00:00.193) 0:20:13.034 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Tuesday 19 August 2025 14:36:25 -0400 (0:00:00.191) 0:20:13.225 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 19 August 2025 14:36:25 -0400 (0:00:00.238) 0:20:13.463 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:36:25 -0400 (0:00:00.258) 0:20:13.721 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:36:25 -0400 (0:00:00.248) 0:20:13.969 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:36:26 -0400 (0:00:01.122) 0:20:15.092 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:36:27 -0400 (0:00:00.357) 0:20:15.450 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:36:27 -0400 (0:00:00.217) 0:20:15.667 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:36:27 -0400 (0:00:00.404) 0:20:16.072 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:36:28 -0400 (0:00:00.248) 0:20:16.320 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:36:28 -0400 (0:00:00.184) 0:20:16.505 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:36:28 -0400 (0:00:00.298) 0:20:16.803 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:36:28 -0400 (0:00:00.329) 0:20:17.133 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 
August 2025 14:36:29 -0400 (0:00:00.194) 0:20:17.327 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:36:29 -0400 (0:00:00.236) 0:20:17.563 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:36:29 -0400 (0:00:00.198) 0:20:17.761 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:36:29 -0400 (0:00:00.099) 0:20:17.861 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:36:30 -0400 (0:00:00.443) 0:20:18.305 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:36:30 -0400 (0:00:00.311) 0:20:18.616 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:36:30 -0400 (0:00:00.259) 0:20:18.876 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:36:31 -0400 (0:00:00.272) 0:20:19.148 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:36:31 -0400 (0:00:00.320) 0:20:19.468 ******** ok: [managed-node11] => { "ansible_facts": { 
"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:36:31 -0400 (0:00:00.254) 0:20:19.723 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:36:31 -0400 (0:00:00.303) 0:20:20.026 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:36:32 -0400 (0:00:00.428) 0:20:20.454 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628541.2797813, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628541.2797813, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 266139, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755628541.2797813, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:36:33 -0400 (0:00:01.466) 0:20:21.920 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:36:34 -0400 (0:00:00.383) 0:20:22.303 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:36:34 -0400 (0:00:00.334) 0:20:22.638 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:36:34 -0400 (0:00:00.319) 0:20:22.958 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK 
[Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:36:35 -0400 (0:00:00.771) 0:20:23.730 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:36:35 -0400 (0:00:00.308) 0:20:24.038 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:36:36 -0400 (0:00:00.341) 0:20:24.380 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:36:36 -0400 (0:00:00.281) 0:20:24.661 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:36:41 -0400 (0:00:04.729) 0:20:29.391 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:36:41 -0400 (0:00:00.212) 0:20:29.604 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:36:41 -0400 (0:00:00.197) 0:20:29.801 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:36:41 -0400 (0:00:00.205) 0:20:30.006 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:36:42 -0400 (0:00:00.292) 0:20:30.299 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:36:42 -0400 (0:00:00.216) 0:20:30.516 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:36:42 -0400 (0:00:00.197) 0:20:30.713 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:36:42 -0400 (0:00:00.185) 0:20:30.898 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:36:42 -0400 (0:00:00.195) 0:20:31.094 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:36:43 -0400 (0:00:00.325) 0:20:31.420 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:36:43 -0400 (0:00:00.164) 0:20:31.584 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:36:43 -0400 (0:00:00.197) 0:20:31.781 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:36:43 -0400 (0:00:00.179) 0:20:31.961 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:36:43 -0400 (0:00:00.147) 0:20:32.108 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about 
RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:36:44 -0400 (0:00:00.173) 0:20:32.282 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:36:44 -0400 (0:00:00.181) 0:20:32.463 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:36:44 -0400 (0:00:00.161) 0:20:32.625 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:36:44 -0400 (0:00:00.206) 0:20:32.831 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:36:44 -0400 (0:00:00.192) 0:20:33.024 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:36:45 -0400 (0:00:00.208) 0:20:33.232 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:36:45 -0400 (0:00:00.160) 0:20:33.392 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:36:45 -0400 (0:00:00.178) 0:20:33.571 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:36:45 -0400 (0:00:00.182) 0:20:33.753 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 
14:36:45 -0400 (0:00:00.097) 0:20:33.851 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:36:45 -0400 (0:00:00.105) 0:20:33.957 ******** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:36:47 -0400 (0:00:01.292) 0:20:35.249 ******** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:36:48 -0400 (0:00:01.333) 0:20:36.583 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:36:48 -0400 (0:00:00.353) 0:20:36.937 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:36:49 -0400 (0:00:00.271) 0:20:37.208 ******** ok: [managed-node11] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:36:50 -0400 (0:00:01.475) 0:20:38.683 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:36:50 -0400 (0:00:00.277) 0:20:38.961 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:36:51 -0400 (0:00:00.251) 0:20:39.212 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:36:51 -0400 (0:00:00.293) 0:20:39.506 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
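Both size-parsing tasks above resolve the volume's requested "4g" to 4294967296 bytes (4 GiB = 4 * 2^30 bytes). The same conversion can be reproduced with Ansible's core human_to_bytes filter; the two tasks below are a minimal sketch under that assumption (the fact name storage_test_requested_bytes is illustrative), not the parsing tasks the test suite itself ships:

# Sketch only: reproduce the 4g -> 4294967296 conversion seen in the output above.
- name: Convert the requested size to bytes
  set_fact:
    storage_test_requested_bytes: "{{ '4g' | human_to_bytes }}"

- name: Check the conversion against the logged actual size
  assert:
    that:
      - "storage_test_requested_bytes | int == 4294967296"

This is why the "Assert expected size is actual size" task further down passes: the expected size fact ("4294967296") and the parsed actual size of /dev/mapper/foo-test1 are the same number.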
TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:36:51 -0400 (0:00:00.235) 0:20:39.741 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:36:51 -0400 (0:00:00.296) 0:20:40.038 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:36:52 -0400 (0:00:00.275) 0:20:40.314 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:36:52 -0400 (0:00:00.267) 0:20:40.582 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:36:52 -0400 (0:00:00.308) 0:20:40.891 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:36:53 -0400 (0:00:00.270) 0:20:41.161 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:36:53 -0400 (0:00:00.239) 0:20:41.432 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:36:53 -0400 (0:00:00.266) 0:20:41.672 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:36:53 -0400 (0:00:00.266) 0:20:41.938 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:36:54 -0400 (0:00:00.353) 0:20:42.291 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task
path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:36:54 -0400 (0:00:00.188) 0:20:42.479 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:36:54 -0400 (0:00:00.280) 0:20:42.760 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:36:54 -0400 (0:00:00.196) 0:20:42.956 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:36:55 -0400 (0:00:00.247) 0:20:43.204 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:36:55 -0400 (0:00:00.261) 0:20:43.465 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:36:55 -0400 (0:00:00.272) 0:20:43.738 ******** ok: [managed-node11] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:36:55 -0400 (0:00:00.231) 0:20:43.970 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:36:56 -0400 (0:00:00.189) 0:20:44.159 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:36:56 -0400 (0:00:00.213) 0:20:44.372 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023186", "end": "2025-08-19 14:36:57.228148", "rc": 0, "start": "2025-08-19 14:36:57.204962" } STDOUT: 
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:36:57 -0400 (0:00:01.304) 0:20:45.677 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:36:57 -0400 (0:00:00.233) 0:20:45.911 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:36:58 -0400 (0:00:00.311) 0:20:46.222 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:36:58 -0400 (0:00:00.061) 0:20:46.284 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:36:58 -0400 (0:00:00.187) 0:20:46.472 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:36:58 -0400 (0:00:00.308) 0:20:46.780 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:36:58 -0400 (0:00:00.155) 0:20:46.935 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:36:58 -0400 (0:00:00.134) 0:20:47.070 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:36:59 -0400 (0:00:00.133) 0:20:47.204 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
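The cache verification above ("Get information about the LV" through "Check segment type") shells out to lvs with name-prefixed, unquoted output and reads LVM2_SEGTYPE back out of it. The following is a standalone sketch of that pattern; the task names, the lvs_info register variable, and the regex-based parsing are illustrative assumptions rather than the role's or the test's actual implementation:

# Illustrative sketch of the lvs query logged above; the LVM2_SEGTYPE extraction
# shown here is an assumption, not the test's own code.
- name: Query the LV's segment and cache attributes
  command: >
    lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
    -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_info
  changed_when: false

- name: Extract the segment type (yields ['linear'] for this run)
  set_fact:
    storage_test_lv_segtype: "{{ lvs_info.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"

- name: Fail unless the LV is a plain linear volume
  assert:
    that:
      - "storage_test_lv_segtype == ['linear']"

The empty LVM2_CACHE_TOTAL_BLOCKS and LVM2_CHUNK_SIZE=0 in the logged STDOUT are consistent with the cache-size checks that skip immediately after the segment-type assertion.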
TASK [Create a file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 19 August 2025 14:36:59 -0400 (0:00:00.114) 0:20:47.318 ******** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:471 Tuesday 19 August 2025 14:37:00 -0400 (0:00:01.257) 0:20:48.576 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 19 August 2025 14:37:01 -0400 (0:00:00.609) 0:20:49.186 ******** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 19 August 2025 14:37:01 -0400 (0:00:00.233) 0:20:49.419 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:37:02 -0400 (0:00:00.809) 0:20:50.228 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:37:02 -0400 (0:00:00.368) 0:20:50.597 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:37:02 -0400 (0:00:00.331) 0:20:50.928 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet",
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:37:03 -0400 (0:00:00.519) 0:20:51.447 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:37:03 -0400 (0:00:00.181) 0:20:51.628 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:37:03 -0400 (0:00:00.282) 0:20:51.911 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:37:03 -0400 (0:00:00.228) 0:20:52.139 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:37:04 -0400 (0:00:00.290) 0:20:52.430 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:37:04 -0400 (0:00:00.542) 0:20:52.972 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:37:09 -0400 (0:00:04.382) 0:20:57.354 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:37:09 -0400 (0:00:00.217) 0:20:57.572 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK 
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:37:09 -0400 (0:00:00.191) 0:20:57.763 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:37:14 -0400 (0:00:05.254) 0:21:03.018 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:37:15 -0400 (0:00:00.379) 0:21:03.397 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:37:15 -0400 (0:00:00.164) 0:21:03.561 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:37:15 -0400 (0:00:00.170) 0:21:03.731 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:37:15 -0400 (0:00:00.220) 0:21:03.952 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:37:19 -0400 (0:00:04.160) 0:21:08.112 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name":
"chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { 
"name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service": { "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service": { "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:37:22 -0400 (0:00:02.794) 0:21:10.907 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:37:23 -0400 (0:00:00.260) 0:21:11.167 ******** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d394e1e52\x2dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-394e1e52-cec6-40c8-bf74-b0e069960529", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach 
luks-394e1e52-cec6-40c8-bf74-b0e069960529 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-394e1e52-cec6-40c8-bf74-b0e069960529 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": 
"no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-08-19 14:33:11 EDT", "StateChangeTimestampMonotonic": "2708111602", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", 
"DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", 
"StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:37:25 -0400 (0:00:02.814) 0:21:13.981 ******** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 19 August 2025 14:37:30 -0400 (0:00:05.142) 0:21:19.124 ******** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:37:31 -0400 (0:00:00.234) 0:21:19.358 ******** changed: [managed-node11] => 
(item=systemd-cryptsetup@luks\x2d394e1e52\x2dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d394e1e52\\x2dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...dcec6\x2d40c8\x2dbf74\x2db0e069960529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "name": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dcec6\\x2d40c8\\x2dbf74\\x2db0e069960529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 19 August 2025 14:37:34 -0400 (0:00:03.018) 0:21:22.376 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 19 August 2025 14:37:34 -0400 (0:00:00.230) 0:21:22.607 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 
Tuesday 19 August 2025 14:37:34 -0400 (0:00:00.403) 0:21:23.011 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 19 August 2025 14:37:35 -0400 (0:00:00.235) 0:21:23.246 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628620.1152334, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755628620.1152334, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1755628620.1152334, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3821451976", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 19 August 2025 14:37:36 -0400 (0:00:01.281) 0:21:24.528 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:494 Tuesday 19 August 2025 14:37:36 -0400 (0:00:00.265) 0:21:24.793 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:37:37 -0400 (0:00:00.916) 0:21:25.709 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:37:37 -0400 (0:00:00.401) 0:21:26.111 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:37:38 -0400 (0:00:00.285) 0:21:26.396 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", 
"stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:37:38 -0400 (0:00:00.488) 0:21:26.885 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:37:38 -0400 (0:00:00.231) 0:21:27.116 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:37:39 -0400 (0:00:00.197) 0:21:27.313 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:37:39 -0400 (0:00:00.251) 0:21:27.565 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:37:39 -0400 (0:00:00.132) 0:21:27.698 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:37:40 -0400 (0:00:00.546) 0:21:28.245 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:37:44 -0400 (0:00:04.339) 0:21:32.584 ******** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, 
"encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:37:44 -0400 (0:00:00.241) 0:21:32.826 ******** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:37:44 -0400 (0:00:00.194) 0:21:33.021 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:37:50 -0400 (0:00:05.530) 0:21:38.551 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:37:50 -0400 (0:00:00.402) 0:21:38.954 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:37:51 -0400 (0:00:00.266) 0:21:39.221 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:37:51 -0400 (0:00:00.223) 0:21:39.444 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:37:51 -0400 (0:00:00.152) 0:21:39.596 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:37:55 -0400 (0:00:04.524) 0:21:44.121 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": 
"sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task 
path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:37:58 -0400 (0:00:02.895) 0:21:47.016 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:37:59 -0400 (0:00:00.451) 0:21:47.467 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:37:59 -0400 (0:00:00.238) 0:21:47.706 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:38:13 -0400 (0:00:13.727) 0:22:01.434 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 14:38:13 -0400 (0:00:00.219) 0:22:01.654 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628551.192837, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1755628551.189837, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755628551.189837, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:38:14 -0400 (0:00:01.477) 0:22:03.131 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:38:16 -0400 (0:00:01.625) 0:22:04.757 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:38:17 -0400 (0:00:00.690) 0:22:05.447 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", 
"state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:38:17 -0400 (0:00:00.117) 0:22:05.564 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": 
true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:38:17 -0400 (0:00:00.277) 0:22:05.842 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:38:17 -0400 (0:00:00.236) 0:22:06.079 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:38:19 -0400 (0:00:01.442) 0:22:07.522 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:38:21 -0400 (0:00:01.723) 0:22:09.245 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:38:22 -0400 (0:00:01.837) 0:22:11.082 ******** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { 
"ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:38:23 -0400 (0:00:00.349) 0:22:11.432 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:38:24 -0400 (0:00:01.670) 0:22:13.103 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628562.1548996, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1755628555.9038637, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 285212868, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1755628555.9018636, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "263789199", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:38:26 -0400 (0:00:01.398) 0:22:14.501 ******** changed: [managed-node11] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:38:27 -0400 (0:00:01.593) 0:22:16.094 ******** ok: [managed-node11] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:510 Tuesday 19 August 2025 14:38:29 -0400 (0:00:01.805) 0:22:17.900 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:38:30 -0400 (0:00:00.469) 0:22:18.370 ******** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": 
[ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:38:30 -0400 (0:00:00.220) 0:22:18.590 ******** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:38:30 -0400 (0:00:00.202) 0:22:18.792 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e" }, "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "size": "4G", "type": "crypt", "uuid": "fa7cc22b-fd4a-4777-93d6-e9e5ca655ca2" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "rZPaos-Ceok-3XVM-1kaP-jB95-fFYd-ITZHBi" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:38:31 -0400 (0:00:01.232) 0:22:20.025 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002158", "end": "2025-08-19 14:38:32.966145", "rc": 0, "start": "2025-08-19 14:38:32.963987" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:38:33 -0400 (0:00:01.370) 0:22:21.396 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002940", "end": "2025-08-19 14:38:34.137975", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:38:34.135035" } STDOUT: luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:38:34 -0400 (0:00:01.151) 0:22:22.547 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 19 August 2025 14:38:34 -0400 (0:00:00.364) 0:22:22.912 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 19 August 2025 14:38:34 -0400 (0:00:00.178) 0:22:23.090 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022116", "end": "2025-08-19 14:38:36.329605", "rc": 0, "start": "2025-08-19 14:38:36.307489" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 19 August 2025 14:38:36 -0400 (0:00:01.679) 0:22:24.769 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 19 August 2025 14:38:37 -0400 (0:00:00.442) 0:22:25.212 ******** included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 19 August 2025 14:38:37 -0400 (0:00:00.517) 0:22:25.729 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 19 August 2025 14:38:37 -0400 (0:00:00.245) 0:22:25.975 ******** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 19 August 2025 14:38:38 -0400 (0:00:01.103) 0:22:27.079 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 19 August 2025 14:38:39 -0400 (0:00:00.154) 0:22:27.233 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 19 August 2025 14:38:39 -0400 (0:00:00.171) 0:22:27.404 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 19 August 2025 14:38:39 -0400 (0:00:00.217) 0:22:27.621 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 19 August 2025 14:38:39 -0400 (0:00:00.248) 0:22:27.870 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 19 August 2025 14:38:40 -0400 (0:00:00.326) 0:22:28.196 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Tuesday 19 August 2025 14:38:40 -0400 (0:00:00.144) 0:22:28.341 ******** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Tuesday 19 August 2025 14:38:40 -0400 (0:00:00.235) 0:22:28.576 ******** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.43.133 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Tuesday 19 August 2025 14:38:42 -0400 (0:00:01.725) 0:22:30.302 ******** skipping: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Tuesday 19 August 2025 14:38:42 -0400 (0:00:00.239) 0:22:30.541 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 19 August 2025 14:38:42 -0400 (0:00:00.422) 0:22:30.964 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 19 August 2025 14:38:43 -0400 (0:00:00.252) 0:22:31.216 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 19 August 2025 14:38:43 -0400 (0:00:00.225) 0:22:31.442 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 19 August 2025 14:38:43 -0400 (0:00:00.246) 0:22:31.689 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 19 August 2025 14:38:43 -0400 (0:00:00.241) 0:22:31.931 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 19 August 2025 14:38:43 -0400 (0:00:00.161) 0:22:32.092 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 19 August 2025 14:38:44 -0400 (0:00:00.111) 0:22:32.204 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 19 August 2025 14:38:44 -0400 (0:00:00.153) 0:22:32.357 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 19 August 2025 14:38:44 -0400 (0:00:00.158) 0:22:32.515 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 19 August 2025 14:38:44 -0400 (0:00:00.180) 0:22:32.696 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 19 August 2025 14:38:44 -0400 (0:00:00.074) 0:22:32.770 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Tuesday 19 August 2025 14:38:44 -0400 (0:00:00.126) 0:22:32.896 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 19 August 2025 14:38:45 -0400 (0:00:00.310) 0:22:33.207 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node11 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Tuesday 19 August 2025 14:38:45 -0400 (0:00:00.509) 0:22:33.716 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Tuesday 19 August 2025 14:38:45 -0400 (0:00:00.207) 0:22:33.923 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Tuesday 19 August 2025 14:38:46 -0400 (0:00:00.329) 0:22:34.253 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Tuesday 19 August 2025 14:38:46 -0400 (0:00:00.328) 0:22:34.581 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Tuesday 19 August 2025 14:38:46 -0400 (0:00:00.363) 0:22:34.944 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Tuesday 19 August 2025 14:38:46 -0400 (0:00:00.195) 0:22:35.139 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Tuesday 19 August 2025 14:38:47 -0400 (0:00:00.225) 0:22:35.365 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Tuesday 19 August 2025 14:38:47 -0400 (0:00:00.206) 0:22:35.572 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 19 August 2025 14:38:47 -0400 (0:00:00.367) 0:22:35.939 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node11 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Tuesday 19 August 2025 14:38:48 -0400 (0:00:00.399) 0:22:36.339 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Tuesday 19 August 2025 14:38:48 -0400 (0:00:00.315) 0:22:36.654 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Tuesday 19 August 2025 14:38:48 -0400 (0:00:00.111) 0:22:36.766 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Tuesday 19 August 2025 14:38:48 -0400 (0:00:00.157) 0:22:36.923 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Tuesday 19 August 2025 14:38:48 -0400 (0:00:00.097) 0:22:37.021 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 19 August 2025 14:38:49 -0400 (0:00:00.504) 0:22:37.525 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 19 August 2025 14:38:49 -0400 (0:00:00.307) 0:22:37.833 ******** skipping: [managed-node11] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 19 August 2025 14:38:49 -0400 (0:00:00.163) 0:22:37.996 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node11 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Tuesday 19 August 2025 14:38:50 -0400 (0:00:00.290) 0:22:38.287 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Tuesday 19 August 2025 14:38:50 -0400 (0:00:00.195) 0:22:38.483 ******** ok: [managed-node11] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Tuesday 19 August 2025 14:38:50 -0400 (0:00:00.149) 0:22:38.633 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Tuesday 19 August 2025 14:38:50 -0400 (0:00:00.099) 0:22:38.732 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Tuesday 19 August 2025 14:38:50 -0400 (0:00:00.167) 0:22:38.899 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Tuesday 19 August 2025 14:38:51 -0400 (0:00:00.307) 0:22:39.207 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 19 August 2025 14:38:51 -0400 (0:00:00.182) 0:22:39.389 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Tuesday 19 August 2025 14:38:51 -0400 (0:00:00.540) 0:22:39.929 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 19 August 2025 14:38:52 -0400 (0:00:00.349) 0:22:40.279 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node11 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Tuesday 19 August 2025 14:38:52 -0400 (0:00:00.310) 0:22:40.590 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Tuesday 19 August 2025 14:38:52 -0400 (0:00:00.159) 0:22:40.749 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Tuesday 19 August 2025 14:38:52 -0400 (0:00:00.150) 0:22:40.900 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Tuesday 19 August 2025 14:38:52 -0400 (0:00:00.171) 0:22:41.071 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Tuesday 19 August 2025 14:38:53 -0400 (0:00:00.224) 0:22:41.296 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Tuesday 19 August 2025 14:38:53 -0400 (0:00:00.118) 0:22:41.414 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Tuesday 19 August 2025 14:38:53 -0400 (0:00:00.206) 0:22:41.621 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Tuesday 19 August 2025 14:38:53 -0400 (0:00:00.260) 0:22:41.881 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 19 August 2025 14:38:54 -0400 (0:00:00.449) 0:22:42.331 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 19 August 2025 14:38:54 -0400 (0:00:00.173) 0:22:42.505 ******** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 19 August 2025 14:38:54 -0400 (0:00:00.079) 0:22:42.584 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 19 August 2025 
14:38:54 -0400 (0:00:00.161) 0:22:42.746 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 19 August 2025 14:38:54 -0400 (0:00:00.188) 0:22:42.934 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 19 August 2025 14:38:55 -0400 (0:00:00.258) 0:22:43.193 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 19 August 2025 14:38:55 -0400 (0:00:00.213) 0:22:43.407 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Tuesday 19 August 2025 14:38:55 -0400 (0:00:00.173) 0:22:43.581 ******** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 19 August 2025 14:38:55 -0400 (0:00:00.086) 0:22:43.668 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:38:55 -0400 (0:00:00.419) 0:22:44.087 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:38:56 -0400 (0:00:00.304) 0:22:44.392 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:38:57 -0400 (0:00:01.192) 0:22:45.584 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:38:57 -0400 (0:00:00.302) 0:22:45.887 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:38:58 -0400 (0:00:00.277) 0:22:46.164 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:38:58 -0400 (0:00:00.219) 0:22:46.384 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:38:58 -0400 (0:00:00.182) 0:22:46.566 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:38:58 -0400 (0:00:00.307) 0:22:46.874 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:38:58 -0400 (0:00:00.198) 0:22:47.072 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:38:59 -0400 (0:00:00.165) 0:22:47.238 ******** 
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:38:59 -0400 (0:00:00.396) 0:22:47.635 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:38:59 -0400 (0:00:00.248) 0:22:47.883 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:38:59 -0400 (0:00:00.213) 0:22:48.096 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:39:00 -0400 (0:00:00.299) 0:22:48.396 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:39:00 -0400 (0:00:00.401) 0:22:48.797 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:39:00 -0400 (0:00:00.206) 0:22:49.004 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:39:01 -0400 (0:00:00.387) 0:22:49.392 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:39:01 -0400 (0:00:00.261) 0:22:49.654 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:39:01 -0400 (0:00:00.274) 0:22:49.929 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:39:02 -0400 (0:00:00.306) 0:22:50.235 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:39:02 -0400 (0:00:00.303) 0:22:50.538 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:39:02 -0400 (0:00:00.369) 0:22:50.908 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628692.8896527, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628692.8896527, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 266139, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755628692.8896527, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:39:04 -0400 (0:00:01.442) 0:22:52.351 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:39:04 -0400 (0:00:00.240) 0:22:52.591 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:39:05 -0400 (0:00:00.695) 0:22:53.286 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:39:05 -0400 (0:00:00.192) 0:22:53.479 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:39:05 -0400 (0:00:00.229) 0:22:53.708 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:39:05 -0400 (0:00:00.155) 0:22:53.864 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:39:05 -0400 (0:00:00.195) 0:22:54.060 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628693.0286534, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628693.0286534, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 282680, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1755628693.0286534, "nlink": 1, "path": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:39:07 -0400 (0:00:01.179) 0:22:55.239 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:39:11 -0400 (0:00:04.558) 0:22:59.798 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009266", "end": "2025-08-19 14:39:12.775293", "rc": 0, "start": "2025-08-19 14:39:12.766027" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 948582 Threads: 2 Salt: a2 37 ca 8a 83 f9 79 78 
57 61 b4 83 18 dc 01 35 83 9f 3c df e0 64 c6 f5 ca af e9 7f f1 e7 cc 71 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 33 48 f0 b1 8e 6b 8a 1c 05 36 df 80 33 dc bb eb 2c b4 b8 79 73 43 e8 48 ea 7e a1 55 47 01 d9 05 Digest: 85 71 27 2c fe 99 a4 aa 87 7f 75 f0 41 21 e5 13 82 68 ca 1d dc 86 8a fb a9 b5 a4 05 72 30 1d 82 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:39:13 -0400 (0:00:01.384) 0:23:01.183 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:39:13 -0400 (0:00:00.227) 0:23:01.410 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:39:13 -0400 (0:00:00.311) 0:23:01.721 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:39:13 -0400 (0:00:00.222) 0:23:01.944 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:39:14 -0400 (0:00:00.204) 0:23:02.148 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:39:14 -0400 (0:00:00.179) 0:23:02.328 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:39:14 -0400 (0:00:00.194) 0:23:02.522 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:39:14 -0400 (0:00:00.260) 0:23:02.783 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } 
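The crypttab checks that follow assert three things about the single /etc/crypttab line shown earlier (luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e /dev/mapper/foo-test1 -): exactly one entry exists for the LUKS name, its backing device is the raw LV /dev/mapper/foo-test1, and its key-file field is "-" (prompt for a passphrase, no key file). A minimal standalone sketch of an equivalent check is below; the task names and the crypttab_out/luks_name/entry variables are illustrative assumptions, not the role's own task code.

- name: Read /etc/crypttab (illustrative helper, not part of the role)
  command: cat /etc/crypttab
  register: crypttab_out
  changed_when: false

- name: Assert a single well-formed crypttab entry for the LUKS volume
  vars:
    luks_name: luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e
    entry: "{{ crypttab_out.stdout_lines | select('search', luks_name) | list }}"
  assert:
    that:
      - entry | length == 1                             # one crypttab entry expected
      - entry[0].split()[1] == '/dev/mapper/foo-test1'  # backing device is the raw LV
      - entry[0].split()[2] == '-'                      # key-file field is "-" (passphrase prompt)

In the log itself, the role's own checks appear to be driven by the _storage_test_crypttab_entries and _storage_test_expected_crypttab_* facts set in the preceding task, which the next four assertion tasks then evaluate.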
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:39:14 -0400 (0:00:00.245) 0:23:03.028 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:39:15 -0400 (0:00:00.165) 0:23:03.193 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:39:15 -0400 (0:00:00.170) 0:23:03.363 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:39:15 -0400 (0:00:00.252) 0:23:03.616 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:39:15 -0400 (0:00:00.234) 0:23:03.850 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:39:15 -0400 (0:00:00.196) 0:23:04.047 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:39:16 -0400 (0:00:00.221) 0:23:04.269 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:39:16 -0400 (0:00:00.179) 0:23:04.449 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:39:16 -0400 (0:00:00.286) 0:23:04.735 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:39:16 -0400 (0:00:00.262) 0:23:04.998 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:39:17 -0400 (0:00:00.160) 0:23:05.158 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:39:17 -0400 (0:00:00.097) 0:23:05.255 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:39:17 -0400 (0:00:00.179) 0:23:05.435 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:39:17 -0400 (0:00:00.269) 0:23:05.704 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:39:17 -0400 (0:00:00.319) 0:23:06.024 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:39:18 -0400 (0:00:00.306) 0:23:06.330 ******** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:39:19 -0400 (0:00:01.301) 0:23:07.632 ******** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:39:20 -0400 (0:00:01.430) 0:23:09.062 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:39:21 
-0400 (0:00:00.216) 0:23:09.279 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:39:21 -0400 (0:00:00.315) 0:23:09.594 ******** ok: [managed-node11] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:39:22 -0400 (0:00:01.469) 0:23:11.064 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:39:23 -0400 (0:00:00.371) 0:23:11.436 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:39:23 -0400 (0:00:00.267) 0:23:11.703 ******** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:39:23 -0400 (0:00:00.336) 0:23:12.040 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:39:24 -0400 (0:00:00.325) 0:23:12.365 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:39:24 -0400 (0:00:00.331) 0:23:12.696 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:39:24 -0400 (0:00:00.280) 0:23:12.977 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:39:25 -0400 (0:00:00.176) 0:23:13.153 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:39:25 -0400 
(0:00:00.320) 0:23:13.473 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:39:25 -0400 (0:00:00.225) 0:23:13.699 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:39:25 -0400 (0:00:00.220) 0:23:13.919 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:39:26 -0400 (0:00:00.280) 0:23:14.200 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:39:26 -0400 (0:00:00.243) 0:23:14.443 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:39:26 -0400 (0:00:00.257) 0:23:14.701 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:39:26 -0400 (0:00:00.297) 0:23:14.999 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:39:27 -0400 (0:00:00.275) 0:23:15.274 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:39:27 -0400 (0:00:00.272) 0:23:15.547 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:39:27 -0400 (0:00:00.219) 0:23:15.767 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 
14:39:27 -0400 (0:00:00.235) 0:23:16.002 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:39:28 -0400 (0:00:00.248) 0:23:16.250 ******** ok: [managed-node11] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:39:28 -0400 (0:00:00.218) 0:23:16.468 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:39:28 -0400 (0:00:00.344) 0:23:16.813 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:39:29 -0400 (0:00:00.358) 0:23:17.171 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.025098", "end": "2025-08-19 14:39:30.169320", "rc": 0, "start": "2025-08-19 14:39:30.144222" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:39:30 -0400 (0:00:01.473) 0:23:18.645 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:39:30 -0400 (0:00:00.292) 0:23:18.938 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:39:30 -0400 (0:00:00.147) 0:23:19.085 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:39:31 -0400 (0:00:00.296) 0:23:19.381 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] 
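The size verification above parses both the requested size ("4 GiB") and the measured device size into bytes before comparing them. A minimal sketch of that comparison pattern, reusing the variable names visible in the log (illustrative only, not the test file's exact expressions):

    - name: Assert expected size is actual size (sketch)
      vars:
        storage_test_expected_size: "4294967296"   # set earlier from the requested size
        storage_test_actual_size:                  # parsed from the real device size
          bytes: 4294967296
      ansible.builtin.assert:
        that:
          - (storage_test_expected_size | int) == storage_test_actual_size.bytes
        fail_msg: "expected {{ storage_test_expected_size }} bytes, got {{ storage_test_actual_size.bytes }}"
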
************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:39:31 -0400 (0:00:00.264) 0:23:19.646 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:39:31 -0400 (0:00:00.257) 0:23:19.903 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:39:32 -0400 (0:00:00.305) 0:23:20.209 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:39:32 -0400 (0:00:00.211) 0:23:20.420 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:39:32 -0400 (0:00:00.173) 0:23:20.594 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:513 Tuesday 19 August 2025 14:39:32 -0400 (0:00:00.159) 0:23:20.754 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 19 August 2025 14:39:34 -0400 (0:00:01.424) 0:23:22.178 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 19 August 2025 14:39:34 -0400 (0:00:00.390) 0:23:22.569 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 19 August 2025 14:39:34 -0400 (0:00:00.371) 0:23:22.940 ******** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 19 August 2025 14:39:35 -0400 (0:00:00.542) 0:23:23.482 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 19 August 2025 14:39:35 -0400 (0:00:00.286) 0:23:23.768 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 19 August 2025 14:39:35 -0400 (0:00:00.310) 0:23:24.079 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 19 August 2025 14:39:36 -0400 (0:00:00.235) 0:23:24.315 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 19 August 2025 14:39:36 -0400 (0:00:00.201) 0:23:24.516 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 19 August 2025 14:39:36 -0400 (0:00:00.427) 0:23:24.943 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 19 August 2025 14:39:41 -0400 (0:00:04.645) 0:23:29.588 ******** ok: [managed-node11] 
=> { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 19 August 2025 14:39:41 -0400 (0:00:00.315) 0:23:29.904 ******** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 19 August 2025 14:39:42 -0400 (0:00:00.299) 0:23:30.203 ******** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 19 August 2025 14:39:47 -0400 (0:00:05.345) 0:23:35.548 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 19 August 2025 14:39:47 -0400 (0:00:00.319) 0:23:35.868 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 19 August 2025 14:39:47 -0400 (0:00:00.211) 0:23:36.080 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 19 August 2025 14:39:48 -0400 (0:00:00.265) 0:23:36.345 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 19 August 2025 14:39:48 -0400 (0:00:00.221) 0:23:36.566 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 19 August 2025 14:39:52 -0400 (0:00:04.463) 0:23:41.029 ******** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": 
"sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task 
path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 19 August 2025 14:39:55 -0400 (0:00:02.579) 0:23:43.608 ******** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 19 August 2025 14:39:55 -0400 (0:00:00.257) 0:23:43.866 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 19 August 2025 14:39:55 -0400 (0:00:00.234) 0:23:44.100 ******** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=rZPaos-Ceok-3XVM-1kaP-jB95-fFYd-ITZHBi", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 19 August 2025 14:40:01 -0400 (0:00:05.869) 0:23:49.969 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 19 August 2025 
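The destroy actions above are what blivet generates when the test's Clean up step hands the role the single "foo" disk volume (shown earlier under Show storage_volumes) with state absent: it removes the LUKS mapping, the LV and VG stacked on it, and finally the lvmpv signature on sda. Reconstructed from the logged variables, the invocation presumably looks roughly like this (a sketch, not a quote from tests_luks.yml):

    - name: Clean up (sketch of the role invocation behind the logged actions)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks: ["sda"]
            state: absent   # lets blivet tear down everything stacked on sda
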
14:40:02 -0400 (0:00:00.317) 0:23:50.286 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628702.5707085, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "971dd53a96fd15e5f9076a4ed3164dc1e1ce566d", "ctime": 1755628702.5687084, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1755628702.5687084, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2238694571", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 19 August 2025 14:40:03 -0400 (0:00:01.542) 0:23:51.829 ******** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 19 August 2025 14:40:05 -0400 (0:00:01.496) 0:23:53.326 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 19 August 2025 14:40:05 -0400 (0:00:00.201) 0:23:53.528 ******** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=rZPaos-Ceok-3XVM-1kaP-jB95-fFYd-ITZHBi", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", 
"mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 19 August 2025 14:40:05 -0400 (0:00:00.264) 0:23:53.793 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 19 August 2025 14:40:05 -0400 (0:00:00.223) 0:23:54.016 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=rZPaos-Ceok-3XVM-1kaP-jB95-fFYd-ITZHBi", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 19 August 2025 14:40:06 -0400 (0:00:00.239) 0:23:54.256 ******** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 19 August 2025 14:40:07 -0400 (0:00:01.338) 0:23:55.594 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 19 August 2025 14:40:09 -0400 (0:00:01.953) 
0:23:57.547 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 19 August 2025 14:40:09 -0400 (0:00:00.149) 0:23:57.697 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 19 August 2025 14:40:09 -0400 (0:00:00.039) 0:23:57.736 ******** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 19 August 2025 14:40:11 -0400 (0:00:01.691) 0:23:59.428 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628714.136775, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6ab8a53f02d2bf4d40538d21bd28a11e9dfcf277", "ctime": 1755628707.6317377, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 440402057, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1755628707.6307375, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1255426852", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 19 August 2025 14:40:12 -0400 (0:00:01.395) 0:24:00.824 ******** changed: [managed-node11] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6c5e45cf-09f6-4725-bb5d-cf60b1ba6b1e", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 19 August 2025 14:40:14 -0400 (0:00:01.576) 0:24:02.400 ******** ok: [managed-node11] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:523 Tuesday 19 August 2025 14:40:16 -0400 (0:00:01.892) 0:24:04.293 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 19 August 2025 14:40:16 -0400 (0:00:00.390) 0:24:04.683 ******** skipping: [managed-node11] => {} TASK [Print out volume information] 
******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 19 August 2025 14:40:16 -0400 (0:00:00.042) 0:24:04.725 ******** ok: [managed-node11] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=rZPaos-Ceok-3XVM-1kaP-jB95-fFYd-ITZHBi", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 19 August 2025 14:40:16 -0400 (0:00:00.071) 0:24:04.797 ******** ok: [managed-node11] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 19 August 2025 14:40:17 -0400 (0:00:01.223) 0:24:06.020 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002122", "end": "2025-08-19 14:40:18.700642", "rc": 0, "start": "2025-08-19 14:40:18.698520" } STDOUT: # system_role:storage # # /etc/fstab # Created 
by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 19 August 2025 14:40:18 -0400 (0:00:00.999) 0:24:07.020 ******** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002100", "end": "2025-08-19 14:40:19.897590", "failed_when_result": false, "rc": 0, "start": "2025-08-19 14:40:19.895490" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 19 August 2025 14:40:20 -0400 (0:00:01.242) 0:24:08.262 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Tuesday 19 August 2025 14:40:20 -0400 (0:00:00.210) 0:24:08.472 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 19 August 2025 14:40:20 -0400 (0:00:00.260) 0:24:08.733 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 19 August 2025 14:40:20 -0400 (0:00:00.199) 0:24:08.932 ******** included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 19 August 2025 14:40:21 -0400 (0:00:00.878) 0:24:09.811 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 19 August 2025 14:40:21 -0400 (0:00:00.148) 0:24:09.960 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 19 August 2025 14:40:22 -0400 (0:00:00.316) 0:24:10.276 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Tuesday 19 August 2025 14:40:22 -0400 (0:00:00.228) 0:24:10.504 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Tuesday 19 August 2025 14:40:22 -0400 (0:00:00.153) 0:24:10.657 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Tuesday 19 August 2025 14:40:22 -0400 (0:00:00.197) 0:24:10.854 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Tuesday 19 August 2025 14:40:22 -0400 (0:00:00.162) 0:24:11.017 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume 
device] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Tuesday 19 August 2025 14:40:23 -0400 (0:00:00.169) 0:24:11.186 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Tuesday 19 August 2025 14:40:23 -0400 (0:00:00.144) 0:24:11.331 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Tuesday 19 August 2025 14:40:23 -0400 (0:00:00.206) 0:24:11.538 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Tuesday 19 August 2025 14:40:23 -0400 (0:00:00.254) 0:24:11.793 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 19 August 2025 14:40:23 -0400 (0:00:00.200) 0:24:11.993 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 19 August 2025 14:40:24 -0400 (0:00:00.460) 0:24:12.453 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 19 August 2025 14:40:24 -0400 (0:00:00.216) 0:24:12.670 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 19 August 2025 14:40:24 -0400 (0:00:00.225) 0:24:12.896 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 19 August 2025 14:40:25 -0400 (0:00:00.247) 0:24:13.144 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 19 August 2025 14:40:25 -0400 (0:00:00.292) 0:24:13.436 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 19 August 2025 14:40:25 -0400 (0:00:00.152) 0:24:13.588 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 19 August 2025 14:40:25 -0400 (0:00:00.226) 0:24:13.815 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 19 August 2025 14:40:25 -0400 (0:00:00.258) 0:24:14.073 ******** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1755628801.3702774, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1755628801.3702774, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37038, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1755628801.3702774, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 19 August 2025 14:40:27 -0400 (0:00:01.783) 0:24:15.857 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 19 August 2025 14:40:27 -0400 (0:00:00.281) 0:24:16.138 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 19 August 2025 14:40:28 -0400 (0:00:00.174) 0:24:16.313 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 19 August 2025 14:40:28 -0400 (0:00:00.143) 0:24:16.456 ******** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 19 August 2025 14:40:28 -0400 (0:00:00.197) 0:24:16.653 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 19 August 2025 14:40:28 -0400 (0:00:00.316) 0:24:16.970 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 19 August 2025 14:40:28 -0400 (0:00:00.161) 0:24:17.132 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 19 August 2025 14:40:29 -0400 (0:00:00.217) 0:24:17.349 ******** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 19 August 2025 14:40:33 -0400 (0:00:04.369) 0:24:21.718 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 19 August 2025 14:40:33 -0400 (0:00:00.249) 0:24:21.968 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 19 August 2025 14:40:34 -0400 (0:00:00.218) 0:24:22.187 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 19 August 2025 14:40:34 -0400 
(0:00:00.147) 0:24:22.334 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 19 August 2025 14:40:34 -0400 (0:00:00.200) 0:24:22.535 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 19 August 2025 14:40:34 -0400 (0:00:00.249) 0:24:22.784 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Tuesday 19 August 2025 14:40:34 -0400 (0:00:00.095) 0:24:22.880 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Tuesday 19 August 2025 14:40:34 -0400 (0:00:00.241) 0:24:23.121 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Tuesday 19 August 2025 14:40:35 -0400 (0:00:00.100) 0:24:23.221 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Tuesday 19 August 2025 14:40:35 -0400 (0:00:00.279) 0:24:23.501 ******** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Tuesday 19 August 2025 14:40:35 -0400 (0:00:00.249) 0:24:23.750 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Tuesday 19 August 2025 14:40:35 -0400 (0:00:00.232) 0:24:23.983 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Tuesday 19 August 2025 14:40:36 -0400 (0:00:00.230) 0:24:24.214 ******** skipping: 
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Tuesday 19 August 2025 14:40:36 -0400 (0:00:00.249) 0:24:24.463 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 19 August 2025 14:40:36 -0400 (0:00:00.187) 0:24:24.651 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 19 August 2025 14:40:36 -0400 (0:00:00.233) 0:24:24.884 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 19 August 2025 14:40:36 -0400 (0:00:00.250) 0:24:25.135 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 19 August 2025 14:40:37 -0400 (0:00:00.247) 0:24:25.382 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 19 August 2025 14:40:37 -0400 (0:00:00.203) 0:24:25.586 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 19 August 2025 14:40:37 -0400 (0:00:00.283) 0:24:25.869 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 19 August 2025 14:40:37 -0400 (0:00:00.172) 0:24:26.042 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 19 August 2025 14:40:38 -0400 (0:00:00.173) 0:24:26.215 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 19 August 2025 14:40:38 -0400 (0:00:00.202) 0:24:26.418 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 19 August 2025 14:40:38 -0400 (0:00:00.172) 0:24:26.590 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 19 August 2025 14:40:38 -0400 (0:00:00.188) 0:24:26.778 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 19 August 2025 14:40:38 -0400 (0:00:00.204) 0:24:26.983 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 19 August 2025 14:40:39 -0400 (0:00:00.222) 0:24:27.206 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 19 August 2025 14:40:39 -0400 (0:00:00.254) 0:24:27.460 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 19 August 2025 14:40:39 -0400 (0:00:00.272) 0:24:27.733 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 19 August 2025 14:40:39 -0400 (0:00:00.199) 0:24:27.933 ******** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 19 August 2025 14:40:39 -0400 (0:00:00.146) 0:24:28.080 ******** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 19 August 2025 14:40:40 -0400 (0:00:00.216) 0:24:28.296 ******** skipping: [managed-node11] => {} TASK [Calculate the 
expected size based on pool size and percentage value] ***** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 19 August 2025 14:40:40 -0400 (0:00:00.283) 0:24:28.580 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Tuesday 19 August 2025 14:40:40 -0400 (0:00:00.289) 0:24:28.870 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Tuesday 19 August 2025 14:40:40 -0400 (0:00:00.192) 0:24:29.063 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Tuesday 19 August 2025 14:40:41 -0400 (0:00:00.323) 0:24:29.386 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Tuesday 19 August 2025 14:40:41 -0400 (0:00:00.272) 0:24:29.658 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Tuesday 19 August 2025 14:40:41 -0400 (0:00:00.310) 0:24:29.969 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Tuesday 19 August 2025 14:40:42 -0400 (0:00:00.297) 0:24:30.266 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Tuesday 19 August 2025 14:40:42 -0400 (0:00:00.235) 0:24:30.502 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Tuesday 19 August 2025 14:40:42 -0400 (0:00:00.295) 0:24:30.798 ******** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Tuesday 19 August 2025 14:40:42 -0400 (0:00:00.299) 
0:24:31.098 ******** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Tuesday 19 August 2025 14:40:43 -0400 (0:00:00.352) 0:24:31.450 ******** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Tuesday 19 August 2025 14:40:43 -0400 (0:00:00.254) 0:24:31.705 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Tuesday 19 August 2025 14:40:43 -0400 (0:00:00.218) 0:24:31.923 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Tuesday 19 August 2025 14:40:44 -0400 (0:00:00.252) 0:24:32.175 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Tuesday 19 August 2025 14:40:44 -0400 (0:00:00.177) 0:24:32.352 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Tuesday 19 August 2025 14:40:44 -0400 (0:00:00.251) 0:24:32.604 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Tuesday 19 August 2025 14:40:44 -0400 (0:00:00.236) 0:24:32.841 ******** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Tuesday 19 August 2025 14:40:44 -0400 (0:00:00.267) 0:24:33.108 ******** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Tuesday 19 August 2025 14:40:45 -0400 (0:00:00.228) 0:24:33.337 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 19 August 2025 14:40:45 -0400 (0:00:00.215) 0:24:33.553 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 19 August 2025 14:40:45 -0400 (0:00:00.271) 0:24:33.824 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 19 August 2025 14:40:45 -0400 (0:00:00.196) 0:24:34.021 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 19 August 2025 14:40:46 -0400 (0:00:00.316) 0:24:34.338 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 19 August 2025 14:40:46 -0400 (0:00:00.316) 0:24:34.654 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 19 August 2025 14:40:46 -0400 (0:00:00.209) 0:24:34.863 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 19 August 2025 14:40:46 -0400 (0:00:00.201) 0:24:35.064 ******** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 19 August 2025 14:40:47 -0400 (0:00:00.252) 0:24:35.317 ******** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Tuesday 19 August 2025 14:40:47 -0400 (0:00:00.190) 0:24:35.507 ******** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node11 : ok=1224 changed=60 unreachable=0 failed=9 skipped=1073 rescued=9 ignored=0 
SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:17:25.656998+00:00Z", "host": "managed-node11", "message": "encrypted volume 'foo' missing key/password", "start_time": "2025-08-19T18:17:21.241935+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:17:25.990938+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:17:25.710226+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:19:33.232798+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2' in safe mode due to 
encryption removal", "start_time": "2025-08-19T18:19:27.977431+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:19:33.496897+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-673852e6-53e2-4ad0-a5cc-94ae9b8114d2' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:19:33.304663+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:21:22.347051+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2025-08-19T18:21:17.171325+00:00Z", "task_name": "Manage 
the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:21:22.504981+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:21:22.401535+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:23:19.689297+00:00Z", "host": "managed-node11", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-08-19T18:23:14.434830+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, 
{ "ansible_version": "2.9.27", "end_time": "2025-08-19T18:23:19.823079+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:23:19.714047+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:25:43.638414+00:00Z", "host": "managed-node11", 
"message": "cannot remove existing formatting on device 'luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f' in safe mode due to encryption removal", "start_time": "2025-08-19T18:25:38.215457+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:25:43.875965+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], 
"mounts": [], "msg": "cannot remove existing formatting on device 'luks-7d1089ca-0f6b-48fe-9a70-e7c81f42081f' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:25:43.672140+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:28:04.227024+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2025-08-19T18:27:58.691440+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:28:04.545620+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": 
true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:28:04.250660+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:30:36.547318+00:00Z", "host": "managed-node11", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-08-19T18:30:31.130815+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:30:36.721151+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:30:36.566263+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:35:03.760807+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'luks-394e1e52-cec6-40c8-bf74-b0e069960529' in safe mode due to encryption removal", "start_time": "2025-08-19T18:34:58.551645+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:35:04.035196+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-394e1e52-cec6-40c8-bf74-b0e069960529' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:35:03.818620+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:37:30.960176+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2025-08-19T18:37:25.839508+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-08-19T18:37:31.188270+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-08-19T18:37:30.982079+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Tuesday 19 August 2025 14:40:47 -0400 (0:00:00.202) 0:24:35.709 ******** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.02s /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.73s /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.70s /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.47s /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.17s /tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.16s 
TASKS RECAP ********************************************************************
Tuesday 19 August 2025 14:40:47 -0400 (0:00:00.202) 0:24:35.709 ********
===============================================================================
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.02s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.73s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.70s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.47s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.17s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.16s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.87s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.85s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.78s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.64s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.61s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get required packages --------------- 5.61s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Make sure blivet is available ------- 5.59s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.56s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.53s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get required packages --------------- 5.53s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.46s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.44s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.43s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.41s
/tmp/collections-yHL/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19