ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_swap.yml *******************************************************
1 plays in /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml

PLAY [Test management of swap] *************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:2
Monday 20 April 2026 16:13:52 -0400 (0:00:00.369) 0:00:00.369 **********
ok: [managed-node8]
META: ran handlers

TASK [Include role to ensure packages are installed] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:11
Monday 20 April 2026 16:13:56 -0400 (0:00:04.712) 0:00:05.082 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node8
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 16:13:56 -0400 (0:00:00.165) 0:00:05.248 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 16:13:57 -0400 (0:00:00.444) 0:00:05.693 **********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 16:13:58 -0400 (0:00:01.015) 0:00:06.709 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 16:13:59 -0400 (0:00:00.564) 0:00:07.273 **********
ok: [managed-node8]

TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 16:14:01 -0400 (0:00:02.349) 0:00:09.623 **********
ok: [managed-node8] => { "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13
Monday 20 April 2026 16:14:04 -0400 (0:00:02.731) 0:00:12.354 **********
skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31
Monday 20 April 2026 16:14:04 -0400 (0:00:00.332) 0:00:12.686 **********
ok: [managed-node8] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36
Monday 20 April 2026 16:14:06 -0400 (0:00:02.440) 0:00:15.127 **********
ok: [managed-node8] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 20 April 2026 16:14:07 -0400 (0:00:00.237) 0:00:15.365 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
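The CentOS_8.yml vars file included above supplies the platform package set. Judging only from the blivet_package_list fact echoed in this run, the file presumably looks like the sketch below (a reconstruction for illustration, not the verbatim file); the Jinja expression at the end selects the s390x-specific libblockdev build when needed:

    # Hypothetical reconstruction of roles/storage/vars/CentOS_8.yml, inferred
    # from the blivet_package_list fact above; not the file's verbatim contents.
    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - vdo
      - kmod-kvdo
      - xfsprogs
      - stratisd
      - stratis-cli
      # s390x needs the platform-specific libblockdev package:
      - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"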
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 16:14:07 -0400 (0:00:00.231) 0:00:15.596 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 16:14:07 -0400 (0:00:00.190) 0:00:15.787 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 16:14:08 -0400 (0:00:01.132) 0:00:16.920 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 16:14:08 -0400 (0:00:00.211) 0:00:17.131 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 16:14:09 -0400 (0:00:00.160) 0:00:17.292 **********
ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 16:14:15 -0400 (0:00:06.162) 0:00:23.455 **********
ok: [managed-node8] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 16:14:15 -0400 (0:00:00.296) 0:00:23.751 **********
ok: [managed-node8] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 16:14:15 -0400 (0:00:00.271) 0:00:24.022 **********
ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 16:14:19 -0400 (0:00:03.847) 0:00:27.869 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 16:14:19 -0400 (0:00:00.378) 0:00:28.248 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 16:14:20 -0400 (0:00:00.262) 0:00:28.510 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 16:14:20 -0400 (0:00:00.254) 0:00:28.764 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 16:14:20 -0400 (0:00:00.174) 0:00:28.939 **********
ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 16:14:25 -0400 (0:00:04.519) 0:00:33.459 **********
ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": 
"plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:14:29 -0400 (0:00:04.251) 0:00:37.711 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:14:29 -0400 (0:00:00.445) 0:00:38.156 ********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:14:31 -0400 (0:00:01.717) 0:00:39.874 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:14:31 -0400 (0:00:00.209) 0:00:40.084 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776715729.6096573, "attr_flags": 
"", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776715728.5716538, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715728.5716538, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:14:33 -0400 (0:00:01.292) 0:00:41.376 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:14:33 -0400 (0:00:00.241) 0:00:41.617 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:14:33 -0400 (0:00:00.339) 0:00:41.956 ********** ok: [managed-node8] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:14:33 -0400 (0:00:00.201) 0:00:42.158 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:14:34 -0400 (0:00:00.245) 0:00:42.404 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:14:34 -0400 (0:00:00.279) 0:00:42.683 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:14:34 -0400 (0:00:00.155) 0:00:42.839 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:14:34 -0400 (0:00:00.178) 0:00:43.017 
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195
Monday 20 April 2026 16:14:35 -0400 (0:00:00.241) 0:00:43.259 **********

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207
Monday 20 April 2026 16:14:35 -0400 (0:00:00.219) 0:00:43.479 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215
Monday 20 April 2026 16:14:35 -0400 (0:00:00.212) 0:00:43.691 **********
ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776713110.4423337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220
Monday 20 April 2026 16:14:36 -0400 (0:00:01.452) 0:00:45.144 **********

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242
Monday 20 April 2026 16:14:37 -0400 (0:00:00.144) 0:00:45.288 **********
ok: [managed-node8]

TASK [fedora.linux_system_roles.storage : Record role success fingerprint] *****
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16
Monday 20 April 2026 16:14:38 -0400 (0:00:01.629) 0:00:46.918 **********
ok: [managed-node8] => { "changed": false }

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:14
Monday 20 April 2026 16:14:40 -0400 (0:00:01.405) 0:00:48.323 **********
ok: [managed-node8] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed" ] }, "changed": false }

TASK [Get unused disks for swap] ***********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:21
Monday 20 April 2026 16:14:40 -0400 (0:00:00.169) 0:00:48.492 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node8

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Monday 20 April 2026 16:14:40 -0400 (0:00:00.305) 0:00:48.798 **********
ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Monday 20 April 2026 16:14:44 -0400 (0:00:03.842) 0:00:52.641 **********
ok: [managed-node8] => { "changed": false, "disks": [ "sda", "sdb" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] }

TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Monday 20 April 2026 16:14:47 -0400 (0:00:02.648) 0:00:55.289 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Monday 20 April 2026 16:14:47 -0400 (0:00:00.194) 0:00:55.484 **********
ok: [managed-node8] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Monday 20 April 2026 16:14:47 -0400 (0:00:00.148) 0:00:55.633 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Monday 20 April 2026 16:14:47 -0400 (0:00:00.179) 0:00:55.813 **********
ok: [managed-node8] => { "unused_disks": [ "sda", "sdb" ] }
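With sda and sdb identified as unused, the next run ("Create a disk device with swap") hands the first disk to the role as a whole-disk swap volume. The storage_volumes value echoed further down corresponds to a request like this sketch (the volume fields are taken from the log; the task wrapper and the unused_disks slice are illustrative):

    # Sketch of the swap volume request applied by the next role run; the
    # volume fields match the storage_volumes value echoed below.
    - name: Create a disk device with swap
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: test1
            type: disk
            disks: "{{ unused_disks[:1] }}"  # resolves to ['sda'] in this run
            fs_type: swap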
TASK [Create a disk device with swap] ******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:29
Monday 20 April 2026 16:14:47 -0400 (0:00:00.168) 0:00:55.981 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node8
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 16:14:47 -0400 (0:00:00.204) 0:00:56.186 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 16:14:48 -0400 (0:00:00.202) 0:00:56.389 **********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 16:14:48 -0400 (0:00:00.256) 0:00:56.645 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 16:14:48 -0400 (0:00:00.161) 0:00:56.807 **********
ok: [managed-node8]

TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 16:14:50 -0400 (0:00:02.340) 0:00:59.148 **********
ok: [managed-node8] => { "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13
Monday 20 April 2026 16:14:52 -0400 (0:00:01.516) 0:01:00.664 **********
skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31
Monday 20 April 2026 16:14:52 -0400 (0:00:00.537) 0:01:01.202 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36
Monday 20 April 2026 16:14:53 -0400 (0:00:00.274) 0:01:01.477 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 20 April 2026 16:14:53 -0400 (0:00:00.276) 0:01:01.754 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 16:14:53 -0400 (0:00:00.190) 0:01:01.944 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 16:14:53 -0400 (0:00:00.163) 0:01:02.108 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 16:14:54 -0400 (0:00:00.480) 0:01:02.588 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 16:14:54 -0400 (0:00:00.188) 0:01:02.777 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 16:14:54 -0400 (0:00:00.207) 0:01:02.985 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 16:14:54 -0400 (0:00:00.183) 0:01:03.168 **********
ok: [managed-node8] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 16:14:55 -0400 (0:00:00.219) 0:01:03.387 **********
ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "fs_type": "swap", "name": "test1", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 16:14:55 -0400 (0:00:00.197) 0:01:03.584 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 16:14:55 -0400 (0:00:00.153) 0:01:03.737 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 16:14:55 -0400 (0:00:00.140) 0:01:03.878 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 16:14:55 -0400 (0:00:00.111) 0:01:03.990 **********
ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source":
"systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": 
"dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": 
"man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": 
"sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" 
}, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:14:58 -0400 (0:00:02.468) 0:01:06.459 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:14:58 -0400 (0:00:00.243) 0:01:06.703 ********** changed: [managed-node8] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "swap" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "state": "present" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": 
"/dev/sda", "_mount_id": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:15:03 -0400 (0:00:05.177) 0:01:11.881 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:15:03 -0400 (0:00:00.265) 0:01:12.146 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776715729.6096573, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776715728.5716538, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715728.5716538, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:15:05 -0400 (0:00:01.384) 0:01:13.531 ********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:15:08 -0400 (0:00:02.776) 0:01:16.307 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:15:08 -0400 (0:00:00.341) 0:01:16.649 ********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "swap" 
} ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "state": "present" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:15:08 -0400 (0:00:00.255) 0:01:16.904 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:15:08 -0400 (0:00:00.210) 0:01:17.115 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184
Monday 20 April 2026 16:15:13 -0400 (0:00:04.197) 0:01:21.594 **********
changed: [managed-node8] => (item={'src': 'UUID=7186789c-f155-4eca-9acb-da069dd36134', 'path': 'none', 'fstype': 'swap', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'present', 'owner': None, 'group': None, 'mode': None}) => {
    "ansible_loop_var": "mount_info",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "swap",
    "mount_info": { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "state": "present" },
    "name": "none",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=7186789c-f155-4eca-9acb-da069dd36134"
}
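The entry applied above uses path "none" with fstype swap, which is how a swap line is expressed to the mount module; state present only edits /etc/fstab and does not run swapon. A standalone task of the same shape (values copied from this run) would be:

```yaml
- name: Add the swap device to /etc/fstab without activating it here (sketch)
  mount:
    src: UUID=7186789c-f155-4eca-9acb-da069dd36134
    path: none
    fstype: swap
    opts: defaults
    state: present
```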
"wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:15:18 -0400 (0:00:01.177) 0:01:26.508 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:15:18 -0400 (0:00:00.099) 0:01:26.607 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:15:20 -0400 (0:00:01.726) 0:01:28.334 ********** ok: [managed-node8] => { "changed": false } TASK [Verify results] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:38 Monday 20 April 2026 16:15:21 -0400 (0:00:01.210) 0:01:29.545 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:15:21 -0400 (0:00:00.355) 0:01:29.900 ********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:15:21 -0400 (0:00:00.209) 0:01:30.110 ********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Monday 20 April 2026 16:15:22 -0400 (0:00:00.188) 0:01:30.299 **********
ok: [managed-node8] => {
    "changed": false,
    "info": {
        "/dev/sda": { "fstype": "swap", "label": "", "mountpoint": "[SWAP]", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "7186789c-f155-4eca-9acb-da069dd36134" },
        "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" },
        "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" },
        "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" },
        "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" },
        "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" }
    }
}
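The info above is blkid/lsblk-style data for every attached disk; /dev/sda now carries the swap signature and is listed with the [SWAP] pseudo mountpoint. An equivalent ad hoc inspection (a sketch, not the test's own task):

```yaml
- name: Inspect the device the same way by hand (sketch)
  command: lsblk -o NAME,FSTYPE,SIZE,TYPE,MOUNTPOINT,UUID /dev/sda
  register: lsblk_out
  changed_when: false
```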
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Monday 20 April 2026 16:15:23 -0400 (0:00:01.952) 0:01:32.251 **********
ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003159", "end": "2026-04-20 16:15:26.209777", "rc": 0, "start": "2026-04-20 16:15:26.206618" }

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=7186789c-f155-4eca-9acb-da069dd36134 none swap defaults 0 0
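The last line of the dump above is the entry the role added. A follow-up check could count that line directly (sketch; the pattern assumes the single-space field separators this run wrote):

```yaml
- name: Assert the swap entry landed in /etc/fstab (sketch)
  command: grep -c '^UUID=7186789c-f155-4eca-9acb-da069dd36134 none swap' /etc/fstab
  register: swap_fstab_lines
  changed_when: false
  failed_when: swap_fstab_lines.stdout | int != 1
```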
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Monday 20 April 2026 16:15:26 -0400 (0:00:02.482) 0:01:34.734 **********
ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.003873", "end": "2026-04-20 16:15:28.531193", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:15:27.527320" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Monday 20 April 2026 16:15:28 -0400 (0:00:02.157) 0:01:36.891 **********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 20 April 2026 16:15:28 -0400 (0:00:00.113) 0:01:37.005 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Monday 20 April 2026 16:15:28 -0400 (0:00:00.221) 0:01:37.227 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Monday 20 April 2026 16:15:29 -0400 (0:00:00.167) 0:01:37.394 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 20 April 2026 16:15:30 -0400 (0:00:01.193) 0:01:38.588 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 20 April 2026 16:15:30 -0400 (0:00:00.222) 0:01:38.810 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "[SWAP]", "storage_test_swap_expected_matches": "1" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 20 April 2026 16:15:30 -0400 (0:00:00.235) 0:01:39.046 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 20 April 2026 16:15:30 -0400 (0:00:00.158) 0:01:39.205 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed
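The check above passed because the device's mount state matches the expected [SWAP] marker. Another way to confirm the kernel actually activated the swap would be to re-gather hardware facts (sketch; ansible_swaptotal_mb is a standard fact):

```yaml
- name: Re-gather hardware facts (sketch)
  setup:
    gather_subset:
      - hardware

- name: The node should now report non-zero swap
  assert:
    that:
      - ansible_swaptotal_mb | int > 0
```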
"/dev/sda" ], "delta": "0:00:00.002618", "end": "2026-04-20 16:15:32.803103", "rc": 0, "start": "2026-04-20 16:15:32.800485" } STDOUT: /dev/sda TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:15:33 -0400 (0:00:01.114) 0:01:41.255 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/proc/swaps" ], "delta": "0:00:00.002473", "end": "2026-04-20 16:15:33.802719", "rc": 0, "start": "2026-04-20 16:15:33.800246" } STDOUT: Filename Type Size Used Priority /dev/sda partition 10485756 0 -2 TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:15:34 -0400 (0:00:01.011) 0:01:42.267 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:15:34 -0400 (0:00:00.159) 0:01:42.426 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:15:34 -0400 (0:00:00.123) 0:01:42.549 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [ "UUID=7186789c-f155-4eca-9acb-da069dd36134 " ], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:15:34 -0400 (0:00:00.408) 0:01:42.958 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:15:34 -0400 (0:00:00.278) 0:01:43.237 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:15:35 -0400 (0:00:00.143) 0:01:43.381 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 
TASK [Verify swap status] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 20 April 2026 16:15:34 -0400 (0:00:01.011) 0:01:42.267 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Unset facts] *************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 20 April 2026 16:15:34 -0400 (0:00:00.159) 0:01:42.426 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 20 April 2026 16:15:34 -0400 (0:00:00.123) 0:01:42.549 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [ "UUID=7186789c-f155-4eca-9acb-da069dd36134 " ], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 20 April 2026 16:15:34 -0400 (0:00:00.408) 0:01:42.958 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 20 April 2026 16:15:34 -0400 (0:00:00.278) 0:01:43.237 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 20 April 2026 16:15:35 -0400 (0:00:00.143) 0:01:43.381 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026 16:15:35 -0400 (0:00:00.130) 0:01:43.511 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026 16:15:35 -0400 (0:00:00.170) 0:01:43.682 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026 16:15:35 -0400 (0:00:00.097) 0:01:43.779 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026 16:15:35 -0400 (0:00:00.283) 0:01:44.062 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed
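The fs type check compares the declared fs_type (swap) against what was detected on disk. Done by hand, the same comparison might read (sketch):

```yaml
- name: Read the on-disk format type (sketch)
  command: blkid -o value -s TYPE /dev/sda
  register: blkid_type
  changed_when: false

- name: It should match the declared fs_type
  assert:
    that:
      - blkid_type.stdout == 'swap'
```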
TASK [See whether the device node is present] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026 16:15:36 -0400 (0:00:00.527) 0:01:44.591 **********
ok: [managed-node8] => {
    "changed": false,
    "stat": { "atime": 1776716103.3468757, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776716103.3288755, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 40009, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776716103.3288755, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026 16:15:37 -0400 (0:00:01.129) 0:01:45.721 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026 16:15:37 -0400 (0:00:00.184) 0:01:45.905 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026 16:15:37 -0400 (0:00:00.238) 0:01:46.143 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 20 April 2026 16:15:38 -0400 (0:00:00.279) 0:01:46.423 **********
ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 20 April 2026 16:15:38 -0400 (0:00:00.187) 0:01:46.611 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 20 April 2026 16:15:38 -0400 (0:00:00.099) 0:01:46.710 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 20 April 2026 16:15:38 -0400 (0:00:00.198) 0:01:46.909 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026 16:15:38 -0400 (0:00:00.199) 0:01:47.109 **********
ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
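Ensure cryptsetup is present returned "Nothing to do" because the package was already installed; the LUKS checks need the tooling even when, as here, the volume is unencrypted and they are all skipped. The task is essentially of this shape (sketch):

```yaml
- name: Make sure the LUKS tooling is installed (sketch)
  package:
    name: cryptsetup
    state: present
```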
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 20 April 2026 16:15:43 -0400 (0:00:00.137) 0:01:51.801 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 20 April 2026 16:15:43 -0400 (0:00:00.188) 0:01:51.989 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 20 April 2026 16:15:43 -0400 (0:00:00.173) 0:01:52.163 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 20 April 2026 16:15:44 -0400 (0:00:00.113) 0:01:52.277 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 20 April 2026 16:15:44 -0400 (0:00:00.151) 0:01:52.429 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026 16:15:44 -0400 (0:00:00.233) 0:01:52.662 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026 16:15:44 -0400 (0:00:00.231) 0:01:52.893 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026 16:15:44 -0400 (0:00:00.221) 0:01:53.115 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 20 April 2026 16:15:45 -0400 (0:00:00.255) 0:01:53.370 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
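The crypttab checks compare the entries found in /etc/crypttab against an expected count ("0" here, since the volume is unencrypted, per _storage_test_expected_crypttab_entries above). A hypothetical reconstruction of that pattern, assuming the file is read into a register named storage_test_crypttab:

    - name: Read /etc/crypttab
      command: cat /etc/crypttab
      register: storage_test_crypttab   # assumed register name
      changed_when: false

    - name: Check for /etc/crypttab entry
      assert:
        that:
          - storage_test_crypttab.stdout_lines | length == _storage_test_expected_crypttab_entries | int
        msg: "Unexpected number of /etc/crypttab entries for this volume"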
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 20 April 2026 16:15:45 -0400 (0:00:00.207) 0:01:53.577 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 20 April 2026 16:15:45 -0400 (0:00:00.153) 0:01:53.731 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 20 April 2026 16:15:45 -0400 (0:00:00.197) 0:01:53.928 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 20 April 2026 16:15:45 -0400 (0:00:00.189) 0:01:54.117 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 20 April 2026 16:15:46 -0400 (0:00:00.220) 0:01:54.338 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 20 April 2026 16:15:46 -0400 (0:00:00.162) 0:01:54.500 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 20 April 2026 16:15:46 -0400 (0:00:00.114) 0:01:54.615 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 20 April 2026 16:15:46 -0400 (0:00:00.215) 0:01:54.830 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 20 April 2026 16:15:46 -0400 (0:00:00.114) 0:01:54.945 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
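All of these md tasks are skipped for a plain disk volume; for RAID volumes they build regexes and parse values out of `mdadm --detail` output. A rough, hypothetical sketch of one set-regex/parse pair (variable names and pattern assumed, not the suite's verbatim source):

    - name: Set chunk size regex
      set_fact:
        storage_test_md_chunk_size_re: 'Chunk Size : ([0-9]+[KMGT])'

    - name: Parse the chunk size
      set_fact:
        # storage_test_md_info is assumed registered from `mdadm --detail <device>`
        storage_test_md_chunk_size: "{{ storage_test_md_info.stdout | regex_search(storage_test_md_chunk_size_re, '\\1') | first }}"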
TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 20 April 2026 16:15:46 -0400 (0:00:00.155) 0:01:55.101 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 20 April 2026 16:15:46 -0400 (0:00:00.127) 0:01:55.228 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026 16:15:47 -0400 (0:00:00.147) 0:01:55.376 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026 16:15:47 -0400 (0:00:00.204) 0:01:55.580 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026 16:15:47 -0400 (0:00:00.152) 0:01:55.733 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026 16:15:47 -0400 (0:00:00.132) 0:01:55.866 **********
ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026 16:15:47 -0400 (0:00:00.113) 0:01:55.980 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 16:15:47 -0400 (0:00:00.136) 0:01:56.117 **********
skipping: [managed-node8] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 16:15:47 -0400 (0:00:00.072) 0:01:56.189 **********
skipping: [managed-node8] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 16:15:48 -0400 (0:00:00.077) 0:01:56.267 **********
skipping: [managed-node8] => {}
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 16:15:48 -0400 (0:00:00.128) 0:01:56.396 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 16:15:48 -0400 (0:00:00.154) 0:01:56.550 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 16:15:48 -0400 (0:00:00.130) 0:01:56.681 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 16:15:48 -0400 (0:00:00.121) 0:01:56.803 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 16:15:48 -0400 (0:00:00.131) 0:01:56.935 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 16:15:48 -0400 (0:00:00.169) 0:01:57.104 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 16:15:48 -0400 (0:00:00.134) 0:01:57.238 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026 16:15:49 -0400 (0:00:00.157) 0:01:57.396 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 20 April 2026 16:15:49 -0400 (0:00:00.111) 0:01:57.507 **********
skipping: [managed-node8] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 20 April 2026 16:15:49 -0400 (0:00:00.151) 0:01:57.658 **********
skipping: [managed-node8] => {}
TASK [Show test volume size] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 20 April 2026 16:15:49 -0400 (0:00:00.120) 0:01:57.779 **********
skipping: [managed-node8] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 20 April 2026 16:15:49 -0400 (0:00:00.168) 0:01:57.948 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 20 April 2026 16:15:49 -0400 (0:00:00.133) 0:01:58.081 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 20 April 2026 16:15:49 -0400 (0:00:00.118) 0:01:58.199 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 20 April 2026 16:15:50 -0400 (0:00:00.145) 0:01:58.344 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 20 April 2026 16:15:50 -0400 (0:00:00.068) 0:01:58.413 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 20 April 2026 16:15:50 -0400 (0:00:00.111) 0:01:58.525 **********
ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 20 April 2026 16:15:50 -0400 (0:00:00.101) 0:01:58.626 **********
ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 20 April 2026 16:15:50 -0400 (0:00:00.143) 0:01:58.769 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
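When a volume size is requested as a percentage, the expected size is derived from the parent pool's size; the whole block is skipped here because this test uses a bare disk, which is also why storage_test_expected_size stays undefined. A hypothetical sketch of the percentage calculation (variable names assumed):

    - name: Calculate the expected size based on pool size and percentage value
      set_fact:
        # storage_test_pool_size and storage_test_requested_size (e.g. "60%")
        # are assumed to be set by earlier tasks
        storage_test_expected_size: "{{ (storage_test_pool_size | int * (storage_test_requested_size | replace('%', '') | int) / 100) | int }}"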
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 20 April 2026 16:15:50 -0400 (0:00:00.147) 0:01:58.917 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 20 April 2026 16:15:50 -0400 (0:00:00.183) 0:01:59.101 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 16:15:50 -0400 (0:00:00.100) 0:01:59.202 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 16:15:51 -0400 (0:00:00.147) 0:01:59.350 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 16:15:51 -0400 (0:00:00.088) 0:01:59.438 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 16:15:51 -0400 (0:00:00.042) 0:01:59.481 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 16:15:51 -0400 (0:00:00.102) 0:01:59.583 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 16:15:51 -0400 (0:00:00.152) 0:01:59.736 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 16:15:51 -0400 (0:00:00.138) 0:01:59.874 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
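The "Format second disk as ext3" block that follows goes through tasks/run_role_with_clear_facts.yml. Its outline, sketched here from the task names and META lines in this log (not the verbatim file), is: clear cached facts, then run the role so it must re-gather whatever it needs:

    - meta: clear_facts

    - name: Run the role normally
      include_role:
        name: fedora.linux_system_roles.storage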
TASK [Format second disk as ext3] **********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:41
Monday 20 April 2026 16:15:51 -0400 (0:00:00.111) 0:01:59.986 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node8
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 16:15:52 -0400 (0:00:00.270) 0:02:00.257 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 16:15:52 -0400 (0:00:00.123) 0:02:00.380 **********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 16:15:52 -0400 (0:00:00.062) 0:02:00.443 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 16:15:52 -0400 (0:00:00.074) 0:02:00.517 **********
ok: [managed-node8]

TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 16:15:53 -0400 (0:00:01.456) 0:02:01.974 **********
ok: [managed-node8] => { "changed": false }
"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:15:54 -0400 (0:00:00.221) 0:02:03.016 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:15:54 -0400 (0:00:00.101) 0:02:03.118 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:15:54 -0400 (0:00:00.104) 0:02:03.223 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:15:55 -0400 (0:00:00.104) 0:02:03.327 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:15:55 -0400 (0:00:00.072) 0:02:03.399 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:15:55 -0400 (0:00:00.195) 0:02:03.595 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:15:55 -0400 (0:00:00.160) 0:02:03.755 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:15:55 -0400 (0:00:00.142) 0:02:03.898 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:15:55 -0400 (0:00:00.117) 0:02:04.015 ********** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 16:15:55 -0400 (0:00:00.097) 0:02:04.113 **********
ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sdb" ], "fs_type": "ext3", "mount_point": "none", "name": "test2", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 16:15:56 -0400 (0:00:00.156) 0:02:04.270 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 16:15:56 -0400 (0:00:00.152) 0:02:04.423 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 16:15:56 -0400 (0:00:00.108) 0:02:04.532 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 16:15:56 -0400 (0:00:00.112) 0:02:04.644 **********
ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" },
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:15:58 -0400 (0:00:02.334) 0:02:06.979 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:15:58 -0400 (0:00:00.193) 0:02:07.173 ********** changed: [managed-node8] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "ext3" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sdb", "_kernel_device": "/dev/sdb", "_mount_id": "UUID=61b4fe33-23e6-4591-8ccf-71b352914ac9", "_raw_device": "/dev/sdb", "_raw_kernel_device": "/dev/sdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb" 
], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:16:07 -0400 (0:00:08.125) 0:02:15.298 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:16:07 -0400 (0:00:00.131) 0:02:15.430 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716116.7909212, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "15d4492999e2e8a58c3b65212ef474e02eba5ec1", "ctime": 1776716115.1919158, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776716115.1919158, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:16:08 -0400 (0:00:00.944) 0:02:16.374 ********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:16:09 -0400 (0:00:00.910) 0:02:17.285 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:16:09 -0400 (0:00:00.182) 0:02:17.468 ********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "ext3" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], 
"volumes": [ { "_device": "/dev/sdb", "_kernel_device": "/dev/sdb", "_mount_id": "UUID=61b4fe33-23e6-4591-8ccf-71b352914ac9", "_raw_device": "/dev/sdb", "_raw_kernel_device": "/dev/sdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:16:09 -0400 (0:00:00.142) 0:02:17.611 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:16:09 -0400 (0:00:00.096) 0:02:17.707 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sdb", "_kernel_device": "/dev/sdb", "_mount_id": "UUID=61b4fe33-23e6-4591-8ccf-71b352914ac9", "_raw_device": "/dev/sdb", "_raw_kernel_device": "/dev/sdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:16:09 -0400 (0:00:00.124) 0:02:17.832 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:16:09 -0400 (0:00:00.179) 0:02:18.012 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:16:09 -0400 (0:00:00.060) 0:02:18.072 ********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:16:09 -0400 (0:00:00.118) 0:02:18.191 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:16:10 -0400 (0:00:00.120) 0:02:18.311 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:16:10 -0400 (0:00:00.136) 0:02:18.448 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776713110.4423337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:16:11 -0400 (0:00:00.956) 0:02:19.405 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:16:11 -0400 (0:00:00.051) 0:02:19.456 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:16:12 -0400 (0:00:01.178) 0:02:20.634 ********** ok: [managed-node8] => { "changed": false } TASK [Verify results - 2] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:51 Monday 20 April 2026 16:16:13 -0400 (0:00:00.816) 0:02:21.451 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:16:13 -0400 (0:00:00.164) 0:02:21.616 ********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:16:13 -0400 (0:00:00.031) 0:02:21.647 ********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sdb", "_kernel_device": "/dev/sdb", "_mount_id": "UUID=61b4fe33-23e6-4591-8ccf-71b352914ac9", "_raw_device": "/dev/sdb", "_raw_kernel_device": "/dev/sdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:16:13 -0400 (0:00:00.059) 0:02:21.707 ********** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "swap", "label": "", "mountpoint": "[SWAP]", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "7186789c-f155-4eca-9acb-da069dd36134" }, "/dev/sdb": { "fstype": "ext3", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "61b4fe33-23e6-4591-8ccf-71b352914ac9" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:16:14 -0400 (0:00:00.844) 0:02:22.552 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002270", "end": "2026-04-20 16:16:14.909471", "rc": 0, "start": "2026-04-20 16:16:14.907201" }

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=7186789c-f155-4eca-9acb-da069dd36134 none swap defaults 0 0

TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:16:15 -0400 (0:00:00.795) 0:02:23.347 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002409", "end": "2026-04-20 16:16:15.787497", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:16:15.785088" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:16:15 -0400 (0:00:00.831) 0:02:24.178 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:16:15 -0400 (0:00:00.045) 0:02:24.224 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:16:16 -0400 (0:00:00.133) 0:02:24.358 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path:
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:16:16 -0400 (0:00:00.055) 0:02:24.414 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:16:16 -0400 (0:00:00.300) 0:02:24.714 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:16:16 -0400 (0:00:00.076) 0:02:24.791 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:16:16 -0400 (0:00:00.077) 0:02:24.868 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:16:16 -0400 (0:00:00.096) 0:02:24.965 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:16:16 -0400 (0:00:00.093) 0:02:25.058 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:16:16 -0400 (0:00:00.067) 0:02:25.126 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:16:16 -0400 (0:00:00.120) 0:02:25.246 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:16:17 -0400 (0:00:00.160) 0:02:25.407 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:16:17 -0400 (0:00:00.115) 0:02:25.523 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:16:17 -0400 (0:00:00.104) 0:02:25.627 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:16:17 -0400 (0:00:00.085) 0:02:25.712 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:16:17 -0400 (0:00:00.091) 0:02:25.805 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:16:17 -0400 (0:00:00.287) 0:02:26.092 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:16:17 -0400 (0:00:00.122) 0:02:26.215 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:16:18 -0400 (0:00:00.073) 0:02:26.289 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:16:18 -0400 (0:00:00.028) 0:02:26.317 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:16:18 -0400 (0:00:00.036) 0:02:26.353 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:16:18 -0400 (0:00:00.040) 0:02:26.394 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:16:18 -0400 (0:00:00.172) 0:02:26.566 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:16:18 -0400 (0:00:00.173) 0:02:26.740 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716166.8120904, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776716166.8120904, "dev": 6, "device_type": 2064, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 40051, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776716166.8120904, "nlink": 1, "path": "/dev/sdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:16:19 -0400 (0:00:00.746) 0:02:27.486 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:16:19 -0400 (0:00:00.145) 0:02:27.631 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:16:19 -0400 (0:00:00.148) 0:02:27.780 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:16:19 -0400 (0:00:00.115) 0:02:27.895 ********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:16:19 -0400 (0:00:00.176) 0:02:28.072 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:16:19 -0400 (0:00:00.150) 0:02:28.223 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:16:20 -0400 (0:00:00.108) 0:02:28.331 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:16:20 -0400 (0:00:00.062) 0:02:28.394 ********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:16:23 -0400 (0:00:03.209) 0:02:31.603 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:16:23 -0400 (0:00:00.099) 0:02:31.703 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:16:23 -0400 (0:00:00.157) 0:02:31.861 ********** ok: [managed-node8] => { 
"changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:16:23 -0400 (0:00:00.084) 0:02:31.945 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:16:23 -0400 (0:00:00.122) 0:02:32.067 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:16:23 -0400 (0:00:00.144) 0:02:32.212 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:16:24 -0400 (0:00:00.186) 0:02:32.398 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:16:24 -0400 (0:00:00.184) 0:02:32.583 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:16:24 -0400 (0:00:00.082) 0:02:32.665 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:16:24 -0400 (0:00:00.134) 0:02:32.800 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:16:24 -0400 (0:00:00.108) 0:02:32.909 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:16:24 -0400 (0:00:00.063) 0:02:32.972 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:16:24 -0400 (0:00:00.076) 0:02:33.049 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:16:24 -0400 (0:00:00.102) 0:02:33.152 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:16:24 -0400 (0:00:00.100) 0:02:33.252 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:16:25 -0400 (0:00:00.133) 0:02:33.386 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:16:25 -0400 (0:00:00.096) 0:02:33.482 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:16:25 -0400 (0:00:00.084) 0:02:33.567 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:16:25 -0400 (0:00:00.134) 0:02:33.701 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:16:25 -0400 (0:00:00.109) 0:02:33.811 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:16:25 -0400 (0:00:00.121) 0:02:33.932 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task 
path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:16:25 -0400 (0:00:00.121) 0:02:34.053 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:16:25 -0400 (0:00:00.085) 0:02:34.139 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:16:25 -0400 (0:00:00.084) 0:02:34.224 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:16:26 -0400 (0:00:00.104) 0:02:34.328 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:16:26 -0400 (0:00:00.141) 0:02:34.470 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:16:26 -0400 (0:00:00.144) 0:02:34.614 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:16:26 -0400 (0:00:00.098) 0:02:34.712 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:16:26 -0400 (0:00:00.134) 0:02:34.847 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:16:26 -0400 (0:00:00.105) 0:02:34.953 ********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:16:26 -0400 (0:00:00.028) 0:02:34.981 ********** skipping: [managed-node8] => {} TASK 
[Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:16:26 -0400 (0:00:00.070) 0:02:35.052 ********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:16:26 -0400 (0:00:00.100) 0:02:35.153 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:16:26 -0400 (0:00:00.066) 0:02:35.219 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:16:27 -0400 (0:00:00.102) 0:02:35.321 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:16:27 -0400 (0:00:00.331) 0:02:35.652 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:16:27 -0400 (0:00:00.091) 0:02:35.744 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:16:27 -0400 (0:00:00.087) 0:02:35.831 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:16:27 -0400 (0:00:00.184) 0:02:36.015 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:16:27 -0400 (0:00:00.025) 0:02:36.041 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:16:27 -0400 (0:00:00.063) 
0:02:36.105 ********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:16:27 -0400 (0:00:00.107) 0:02:36.212 ********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:16:28 -0400 (0:00:00.115) 0:02:36.327 ********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:16:28 -0400 (0:00:00.099) 0:02:36.427 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:16:28 -0400 (0:00:00.132) 0:02:36.560 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:16:28 -0400 (0:00:00.124) 0:02:36.684 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:16:28 -0400 (0:00:00.090) 0:02:36.774 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:16:28 -0400 (0:00:00.078) 0:02:36.852 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:16:28 -0400 (0:00:00.092) 0:02:36.945 ********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:16:28 -0400 (0:00:00.122) 0:02:37.068 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:16:28 -0400 (0:00:00.135) 0:02:37.204 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:16:29 -0400 (0:00:00.083) 0:02:37.287 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:16:29 -0400 (0:00:00.111) 0:02:37.398 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:16:29 -0400 (0:00:00.144) 0:02:37.543 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:16:29 -0400 (0:00:00.136) 0:02:37.679 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:16:29 -0400 (0:00:00.162) 0:02:37.842 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:16:29 -0400 (0:00:00.141) 0:02:37.983 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:16:29 -0400 (0:00:00.198) 0:02:38.182 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:16:30 -0400 (0:00:00.120) 0:02:38.302 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:16:30 -0400 (0:00:00.095) 0:02:38.398 ********** 
ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Change the disk device file system type from swap to ext3] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:54 Monday 20 April 2026 16:16:30 -0400 (0:00:00.107) 0:02:38.505 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node8 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:16:30 -0400 (0:00:00.270) 0:02:38.776 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:16:30 -0400 (0:00:00.169) 0:02:38.945 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:16:30 -0400 (0:00:00.193) 0:02:39.138 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:16:31 -0400 (0:00:00.124) 0:02:39.263 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:16:32 -0400 (0:00:01.459) 0:02:40.722 ********** ok: [managed-node8] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:16:33 -0400 (0:00:00.840) 0:02:41.563 ********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:16:33 -0400 (0:00:00.176) 0:02:41.740 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:16:33 -0400 (0:00:00.109) 0:02:41.850 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:16:33 -0400 (0:00:00.090) 0:02:41.940 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:16:33 -0400 (0:00:00.054) 0:02:41.994 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:16:33 -0400 (0:00:00.068) 0:02:42.062 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:16:34 -0400 (0:00:00.212) 0:02:42.274 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:16:34 -0400 (0:00:00.178) 0:02:42.453 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:16:34 -0400 (0:00:00.136) 0:02:42.589 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:16:34 -0400 (0:00:00.141) 0:02:42.730 ********** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:16:34 -0400 (0:00:00.111) 0:02:42.841 ********** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "fs_type": "ext3", "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:16:34 -0400 (0:00:00.175) 0:02:43.017 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:16:34 -0400 (0:00:00.109) 0:02:43.126 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:16:35 -0400 (0:00:00.192) 0:02:43.318 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:16:35 -0400 (0:00:00.154) 0:02:43.472 ********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": 
"lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:16:37 -0400 (0:00:02.175) 0:02:45.648 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:16:37 -0400 (0:00:00.172) 0:02:45.820 ********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "swap" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext3" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", 
"/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "state": "absent" }, { "dump": 0, "fstype": "ext3", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:16:45 -0400 (0:00:07.829) 0:02:53.649 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:16:45 -0400 (0:00:00.033) 0:02:53.683 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716116.7909212, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "15d4492999e2e8a58c3b65212ef474e02eba5ec1", "ctime": 1776716115.1919158, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776716115.1919158, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:16:45 -0400 (0:00:00.425) 0:02:54.108 ********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:16:45 -0400 (0:00:07.829) 0:02:53.649 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:16:45 -0400 (0:00:00.033) 0:02:53.683 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716116.7909212, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "15d4492999e2e8a58c3b65212ef474e02eba5ec1", "ctime": 1776716115.1919158, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776716115.1919158, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:16:45 -0400 (0:00:00.425) 0:02:54.108 ********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133
Monday 20 April 2026 16:16:46 -0400 (0:00:00.635) 0:02:54.744 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:16:46 -0400 (0:00:00.042) 0:02:54.786 ********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "swap" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext3" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "state": "absent" }, { "dump": 0, "fstype": "ext3", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:16:46 -0400 (0:00:00.047) 0:02:54.833 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:16:46 -0400 (0:00:00.049) 0:02:54.883 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid",
"mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:16:46 -0400 (0:00:00.041) 0:02:54.924 ********** changed: [managed-node8] => (item={'src': 'UUID=7186789c-f155-4eca-9acb-da069dd36134', 'path': 'none', 'state': 'absent', 'fstype': 'swap'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "fstype": "swap", "path": "none", "src": "UUID=7186789c-f155-4eca-9acb-da069dd36134", "state": "absent" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=7186789c-f155-4eca-9acb-da069dd36134" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:16:47 -0400 (0:00:00.649) 0:02:55.574 ********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:16:48 -0400 (0:00:00.985) 0:02:56.560 ********** changed: [managed-node8] => (item={'src': 'UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd', 'path': '/opt/test', 'fstype': 'ext3', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "dump": 0, "fstype": "ext3", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:16:49 -0400 (0:00:00.751) 0:02:57.312 ********** skipping: [managed-node8] => (item={'src': 'UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd', 'path': '/opt/test', 'fstype': 'ext3', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext3", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:16:49 -0400 (0:00:00.133) 0:02:57.445 ********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:16:50 -0400 (0:00:00.968) 0:02:58.414 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776713110.4423337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:16:50 -0400 (0:00:00.639) 0:02:59.054 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:16:50 -0400 (0:00:00.061) 0:02:59.115 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:16:52 -0400 (0:00:01.179) 0:03:00.294 ********** ok: [managed-node8] => { "changed": false } TASK [Verify results - 3] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:64 Monday 20 April 2026 16:16:52 -0400 (0:00:00.840) 0:03:01.135 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:16:53 -0400 (0:00:00.207) 0:03:01.342 ********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:16:53 -0400 (0:00:00.121) 0:03:01.463 ********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": 
false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:16:53 -0400 (0:00:00.144) 0:03:01.608 ********** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext3", "label": "", "mountpoint": "/opt/test", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "8688be8d-cb24-47e7-b41d-ab163b48e7dd" }, "/dev/sdb": { "fstype": "ext3", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "61b4fe33-23e6-4591-8ccf-71b352914ac9" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:16:54 -0400 (0:00:00.775) 0:03:02.384 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002638", "end": "2026-04-20 16:16:54.844623", "rc": 0, "start": "2026-04-20 16:16:54.841985" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd /opt/test ext3 defaults 0 0
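
The first line of the file now carries the "# system_role:storage" fingerprint written by the "Add fingerprint to /etc/fstab if present" task above; the old swap entry for UUID=7186789c-f155-4eca-9acb-da069dd36134 is gone and the new ext3 entry for /opt/test sits at the end, matching the "Remove obsolete mounts" and "Set up new/current mounts" changes earlier. A minimal sketch of the kind of check the verify tasks below perform against this output, with assumed task wording and register name, not the verbatim test code:

    - name: Read /etc/fstab for verification
      command: cat /etc/fstab
      register: storage_test_fstab  # assumed register name
      changed_when: false

    - name: Assert the new volume is listed by UUID exactly once
      assert:
        that:
          - storage_test_fstab.stdout_lines | select('search', '8688be8d-cb24-47e7-b41d-ab163b48e7dd') | list | length == 1
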
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:16:54 -0400 (0:00:00.834) 0:03:03.218 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002239", "end": "2026-04-20 16:16:55.499522", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:16:55.497283" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:16:55 -0400 (0:00:00.645) 0:03:03.864 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:16:55 -0400 (0:00:00.041) 0:03:03.905 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:16:55 -0400 (0:00:00.092) 0:03:03.997 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:16:55 -0400 (0:00:00.075) 0:03:04.073 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:16:56 -0400 (0:00:00.266) 0:03:04.339 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:16:56 -0400 (0:00:00.074) 0:03:04.414 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:16:56 -0400 (0:00:00.093) 0:03:04.507 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:16:56 -0400 (0:00:00.105) 0:03:04.609 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:16:56 -0400 (0:00:00.121) 0:03:04.715 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:16:56 -0400 (0:00:00.095) 0:03:04.837 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:16:56 -0400 (0:00:00.181) 0:03:04.933 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:16:56 -0400 (0:00:00.102) 0:03:05.114 ********** skipping: [managed-node8] => { "changed": false,
"skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:16:56 -0400 (0:00:00.072) 0:03:05.187 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:16:57 -0400 (0:00:00.112) 0:03:05.299 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:16:57 -0400 (0:00:00.160) 0:03:05.460 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:16:57 -0400 (0:00:00.063) 0:03:05.523 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd " ], "storage_test_fstab_mount_options_matches": [ " /opt/test ext3 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:16:57 -0400 (0:00:00.237) 0:03:05.760 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:16:57 -0400 (0:00:00.157) 0:03:05.918 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:16:57 -0400 (0:00:00.090) 0:03:06.008 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:16:57 -0400 (0:00:00.188) 0:03:06.196 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:16:58 -0400 (0:00:00.113) 0:03:06.310 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:16:58 -0400 (0:00:00.100) 0:03:06.411 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:16:58 -0400 (0:00:00.200) 0:03:06.612 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:16:58 -0400 (0:00:00.185) 0:03:06.798 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716205.2872205, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776716205.2872205, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 40009, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776716205.2872205, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:16:59 -0400 (0:00:00.779) 0:03:07.577 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:16:59 -0400 (0:00:00.081) 0:03:07.658 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:16:59 -0400 (0:00:00.081) 0:03:07.740 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:16:59 -0400 (0:00:00.076) 0:03:07.816 ********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:16:59 -0400 (0:00:00.094) 0:03:07.911 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:16:59 -0400 (0:00:00.049) 0:03:07.961 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:16:59 -0400 (0:00:00.099) 0:03:08.060 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:16:59 -0400 (0:00:00.044) 0:03:08.104 ********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:17:02 -0400 (0:00:02.759) 0:03:10.864 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:17:02 -0400 (0:00:00.029) 0:03:10.894 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:17:02 -0400 (0:00:00.027) 0:03:10.921 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:17:02 -0400 (0:00:00.042) 0:03:10.964 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:17:02 -0400 (0:00:00.029) 0:03:10.993 ********** skipping: 
[managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:17:02 -0400 (0:00:00.031) 0:03:11.025 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:17:02 -0400 (0:00:00.028) 0:03:11.053 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:17:02 -0400 (0:00:00.019) 0:03:11.073 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:17:02 -0400 (0:00:00.029) 0:03:11.102 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:17:02 -0400 (0:00:00.038) 0:03:11.141 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:17:02 -0400 (0:00:00.043) 0:03:11.185 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:17:02 -0400 (0:00:00.032) 0:03:11.217 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:17:02 -0400 (0:00:00.027) 0:03:11.244 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:17:03 -0400 (0:00:00.039) 0:03:11.284 ********** ok: [managed-node8] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:17:03 -0400 (0:00:00.026) 0:03:11.311 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:17:03 -0400 (0:00:00.038) 0:03:11.349 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:17:03 -0400 (0:00:00.041) 0:03:11.391 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:17:03 -0400 (0:00:00.041) 0:03:11.432 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:17:03 -0400 (0:00:00.044) 0:03:11.477 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:17:03 -0400 (0:00:00.045) 0:03:11.523 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:17:03 -0400 (0:00:00.030) 0:03:11.554 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:17:03 -0400 (0:00:00.081) 0:03:11.635 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:17:03 -0400 (0:00:00.040) 0:03:11.676 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:17:03 -0400 (0:00:00.047) 0:03:11.724 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:17:03 -0400 (0:00:00.085) 0:03:11.809 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:17:03 -0400 (0:00:00.037) 0:03:11.847 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:17:03 -0400 (0:00:00.107) 0:03:11.955 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:17:03 -0400 (0:00:00.053) 0:03:12.008 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:17:03 -0400 (0:00:00.034) 0:03:12.043 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:17:03 -0400 (0:00:00.048) 0:03:12.091 ********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:17:03 -0400 (0:00:00.091) 0:03:12.182 ********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:17:03 -0400 (0:00:00.062) 0:03:12.245 ********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:17:04 -0400 (0:00:00.048) 0:03:12.293 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:17:04 -0400 (0:00:00.088) 0:03:12.382 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:17:04 -0400 (0:00:00.060) 0:03:12.442 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:17:04 -0400 (0:00:00.027) 0:03:12.470 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:17:04 -0400 (0:00:00.049) 0:03:12.519 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:17:04 -0400 (0:00:00.075) 0:03:12.594 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:17:04 -0400 (0:00:00.056) 0:03:12.651 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:17:04 -0400 (0:00:00.058) 0:03:12.709 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:17:04 -0400 (0:00:00.131) 0:03:12.841 ********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:17:04 -0400 (0:00:00.064) 0:03:12.906 ********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:17:04 -0400 (0:00:00.056) 0:03:12.962 ********** skipping: [managed-node8] => {} TASK 
[Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:17:04 -0400 (0:00:00.094) 0:03:13.057 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:17:04 -0400 (0:00:00.052) 0:03:13.109 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:17:04 -0400 (0:00:00.030) 0:03:13.140 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:17:04 -0400 (0:00:00.051) 0:03:13.191 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:17:05 -0400 (0:00:00.080) 0:03:13.272 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:17:05 -0400 (0:00:00.055) 0:03:13.328 ********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:17:05 -0400 (0:00:00.065) 0:03:13.394 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:17:05 -0400 (0:00:00.126) 0:03:13.521 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:17:05 -0400 (0:00:00.104) 0:03:13.625 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 20 April 2026 16:17:05 -0400 (0:00:00.125) 0:03:13.750 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 16:17:05 -0400 (0:00:00.065) 0:03:13.816 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 16:17:05 -0400 (0:00:00.110) 0:03:13.927 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 16:17:05 -0400 (0:00:00.095) 0:03:14.023 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 16:17:05 -0400 (0:00:00.104) 0:03:14.128 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 16:17:06 -0400 (0:00:00.130) 0:03:14.259 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 16:17:06 -0400 (0:00:00.091) 0:03:14.351 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 16:17:06 -0400 (0:00:00.053) 0:03:14.404 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:67
Monday 20 April 2026 16:17:06 -0400 (0:00:00.121) 0:03:14.523 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node8
META: facts cleared
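The repeated invocation below re-runs the role with exactly the same parameters, so idempotence holds when the second run reports no changes. A minimal sketch of such a check (illustration only, not a task from the test file; it assumes the run's result is available as the blivet_output variable displayed later in this log):

    - name: Assert that the repeated run changed nothing
      ansible.builtin.assert:
        that:
          - not blivet_output.changed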
TASK [Run the role] ************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 16:17:06 -0400 (0:00:00.053) 0:03:14.645 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 16:17:06 -0400 (0:00:00.053) 0:03:14.698 **********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 16:17:06 -0400 (0:00:00.070) 0:03:14.769 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 16:17:06 -0400 (0:00:00.048) 0:03:14.817 **********
ok: [managed-node8]

TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 16:17:07 -0400 (0:00:01.230) 0:03:16.048 **********
ok: [managed-node8] => { "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13
Monday 20 April 2026 16:17:08 -0400 (0:00:00.511) 0:03:16.559 **********
skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31
Monday 20 April 2026 16:17:08 -0400 (0:00:00.086) 0:03:16.645 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36
Monday 20 April 2026 16:17:08 -0400 (0:00:00.034) 0:03:16.679 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 20 April 2026 16:17:08 -0400 (0:00:00.048) 0:03:16.728 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 16:17:08 -0400 (0:00:00.077) 0:03:16.805 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 16:17:08 -0400 (0:00:00.093) 0:03:16.899 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 16:17:08 -0400 (0:00:00.145) 0:03:17.045 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 16:17:08 -0400 (0:00:00.088) 0:03:17.134 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 16:17:08 -0400 (0:00:00.059) 0:03:17.193 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 16:17:09 -0400 (0:00:00.074) 0:03:17.267 **********
ok: [managed-node8] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 16:17:09 -0400 (0:00:00.067) 0:03:17.334 **********
ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "fs_type": "ext3", "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] }
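The storage_volumes value printed above corresponds to a role invocation along these lines (a minimal sketch reconstructed from the logged values; only fields shown in this run are included, and the play/task names are illustrative):

    - hosts: managed-node8
      tasks:
        - name: Run the role normally
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: test1
                type: disk
                disks:
                  - sda
                fs_type: ext3
                mount_point: /opt/test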
TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 16:17:09 -0400 (0:00:00.067) 0:03:17.402 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 16:17:09 -0400 (0:00:00.069) 0:03:17.471 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 16:17:09 -0400 (0:00:00.080) 0:03:17.551 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 16:17:09 -0400 (0:00:00.078) 0:03:17.630 **********
ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": 
"mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:17:11 -0400 (0:00:01.999) 0:03:19.629 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:17:11 -0400 (0:00:00.031) 0:03:19.661 ********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "dump": 0, "fstype": "ext3", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:17:15 -0400 (0:00:04.128) 0:03:23.789 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:17:15 -0400 (0:00:00.041) 0:03:23.831 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716208.9072328, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "881420d87525fee8290f16a81681f44134ef919a", "ctime": 1776716208.9042327, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776716208.9042327, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:17:16 -0400 (0:00:00.437) 0:03:24.269 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:17:16 -0400 (0:00:00.035) 0:03:24.304 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:17:16 -0400 (0:00:00.114) 0:03:24.419 ********** ok: [managed-node8] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "dump": 0, "fstype": "ext3", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", 
"_kernel_device": "/dev/sda", "_mount_id": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:17:16 -0400 (0:00:00.067) 0:03:24.486 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:17:16 -0400 (0:00:00.136) 0:03:24.623 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:17:16 -0400 (0:00:00.097) 0:03:24.720 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:17:16 -0400 (0:00:00.061) 0:03:24.782 ********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set 
up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:17:17 -0400 (0:00:00.906) 0:03:25.689 ********** ok: [managed-node8] => (item={'src': 'UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd', 'path': '/opt/test', 'fstype': 'ext3', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "dump": 0, "fstype": "ext3", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:17:18 -0400 (0:00:00.824) 0:03:26.514 ********** skipping: [managed-node8] => (item={'src': 'UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd', 'path': '/opt/test', 'fstype': 'ext3', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext3", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:17:18 -0400 (0:00:00.092) 0:03:26.606 ********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:17:19 -0400 (0:00:00.890) 0:03:27.497 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776713110.4423337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:17:19 -0400 (0:00:00.697) 0:03:28.195 ********** TASK [fedora.linux_system_roles.storage : 
Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:17:20 -0400 (0:00:00.101) 0:03:28.296 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:17:21 -0400 (0:00:00.997) 0:03:29.293 ********** ok: [managed-node8] => { "changed": false } TASK [Verify results - 4] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:77 Monday 20 April 2026 16:17:21 -0400 (0:00:00.614) 0:03:29.908 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:17:21 -0400 (0:00:00.222) 0:03:30.130 ********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:17:21 -0400 (0:00:00.064) 0:03:30.195 ********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:17:22 -0400 (0:00:00.090) 0:03:30.285 ********** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext3", "label": "", "mountpoint": "/opt/test", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "8688be8d-cb24-47e7-b41d-ab163b48e7dd" }, "/dev/sdb": { "fstype": "ext3", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "61b4fe33-23e6-4591-8ccf-71b352914ac9" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:17:22 -0400 (0:00:00.859) 0:03:31.145 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003200", "end": "2026-04-20 16:17:23.508633", "rc": 0, "start": "2026-04-20 16:17:23.505433" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd /opt/test ext3 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:17:23 -0400 (0:00:00.804) 0:03:31.949 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002277", "end": "2026-04-20 16:17:24.411235", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:17:24.408958" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:17:24 -0400 (0:00:00.857) 0:03:32.806 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:17:24 -0400 (0:00:00.079) 0:03:32.885 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:17:24 -0400 (0:00:00.124) 0:03:33.010 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:17:24 -0400 (0:00:00.093) 0:03:33.104 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:17:25 -0400 (0:00:00.350) 0:03:33.454 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:17:25 -0400 (0:00:00.051) 0:03:33.506 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:17:25 -0400 (0:00:00.072) 0:03:33.579 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:17:25 -0400 (0:00:00.089) 0:03:33.669 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:17:25 -0400 (0:00:00.100) 0:03:33.769 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:17:25 -0400 (0:00:00.104) 0:03:33.874 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:17:25 -0400 (0:00:00.083) 0:03:33.958 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:17:25 -0400 (0:00:00.200) 0:03:34.158 ********** skipping: [managed-node8] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:17:25 -0400 (0:00:00.027) 0:03:34.186 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:17:25 -0400 (0:00:00.035) 0:03:34.222 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:17:26 -0400 (0:00:00.044) 0:03:34.266 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:17:26 -0400 (0:00:00.052) 0:03:34.319 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd " ], "storage_test_fstab_mount_options_matches": [ " /opt/test ext3 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:17:26 -0400 (0:00:00.063) 0:03:34.383 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:17:26 -0400 (0:00:00.099) 0:03:34.482 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:17:26 -0400 (0:00:00.128) 0:03:34.611 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:17:26 -0400 (0:00:00.125) 0:03:34.737 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:17:26 -0400 (0:00:00.143) 0:03:34.881 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:17:26 -0400 (0:00:00.101) 0:03:34.982 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:17:26 -0400 (0:00:00.071) 0:03:35.053 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:17:26 -0400 (0:00:00.152) 0:03:35.206 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716205.2872205, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776716205.2872205, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 40009, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776716205.2872205, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:17:27 -0400 (0:00:00.864) 0:03:36.071 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:17:27 -0400 (0:00:00.180) 0:03:36.251 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:17:28 -0400 (0:00:00.127) 0:03:36.379 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:17:28 -0400 (0:00:00.187) 0:03:36.567 ********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:17:28 -0400 (0:00:00.075) 0:03:36.642 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:17:28 -0400 (0:00:00.137) 0:03:36.779 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:17:28 -0400 (0:00:00.095) 0:03:36.875 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:17:28 -0400 (0:00:00.122) 0:03:36.998 ********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:17:32 -0400 (0:00:03.263) 0:03:40.261 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:17:32 -0400 (0:00:00.074) 0:03:40.335 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:17:32 -0400 (0:00:00.030) 0:03:40.366 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:17:32 -0400 (0:00:00.057) 0:03:40.424 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:17:32 -0400 (0:00:00.030) 0:03:40.454 ********** skipping: 
[managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:17:32 -0400 (0:00:00.060) 0:03:40.515 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:17:32 -0400 (0:00:00.042) 0:03:40.557 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:17:32 -0400 (0:00:00.045) 0:03:40.602 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:17:32 -0400 (0:00:00.051) 0:03:40.654 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:17:32 -0400 (0:00:00.074) 0:03:40.728 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:17:32 -0400 (0:00:00.047) 0:03:40.776 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:17:32 -0400 (0:00:00.053) 0:03:40.829 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:17:32 -0400 (0:00:00.038) 0:03:40.868 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:17:32 -0400 (0:00:00.031) 0:03:40.899 ********** ok: [managed-node8] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:17:32 -0400 (0:00:00.051) 0:03:40.951 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:17:32 -0400 (0:00:00.081) 0:03:41.033 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:17:32 -0400 (0:00:00.055) 0:03:41.088 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:17:32 -0400 (0:00:00.031) 0:03:41.119 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:17:32 -0400 (0:00:00.030) 0:03:41.149 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:17:32 -0400 (0:00:00.029) 0:03:41.179 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:17:32 -0400 (0:00:00.030) 0:03:41.210 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:17:33 -0400 (0:00:00.044) 0:03:41.255 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:17:33 -0400 (0:00:00.056) 0:03:41.311 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:17:33 -0400 (0:00:00.057) 0:03:41.368 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:17:33 -0400 (0:00:00.028) 0:03:41.397 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:17:33 -0400 (0:00:00.084) 0:03:41.481 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:17:33 -0400 (0:00:00.056) 0:03:41.537 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:17:33 -0400 (0:00:00.052) 0:03:41.590 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:17:33 -0400 (0:00:00.029) 0:03:41.619 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:17:33 -0400 (0:00:00.059) 0:03:41.679 ********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:17:33 -0400 (0:00:00.055) 0:03:41.734 ********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:17:33 -0400 (0:00:00.055) 0:03:41.789 ********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:17:33 -0400 (0:00:00.053) 0:03:41.843 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:17:33 -0400 (0:00:00.035) 0:03:41.878 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:17:33 -0400 (0:00:00.042) 0:03:41.921 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:17:33 -0400 (0:00:00.049) 0:03:41.971 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:17:33 -0400 (0:00:00.054) 0:03:42.026 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:17:33 -0400 (0:00:00.095) 0:03:42.121 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:17:33 -0400 (0:00:00.063) 0:03:42.185 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:17:34 -0400 (0:00:00.080) 0:03:42.265 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:17:34 -0400 (0:00:00.047) 0:03:42.312 ********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:17:34 -0400 (0:00:00.061) 0:03:42.374 ********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:17:34 -0400 (0:00:00.070) 0:03:42.445 ********** skipping: [managed-node8] => {} TASK 
[Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:17:34 -0400 (0:00:00.064) 0:03:42.509 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:17:34 -0400 (0:00:00.113) 0:03:42.623 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:17:34 -0400 (0:00:00.169) 0:03:42.792 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:17:34 -0400 (0:00:00.127) 0:03:42.920 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:17:34 -0400 (0:00:00.073) 0:03:42.993 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:17:34 -0400 (0:00:00.114) 0:03:43.108 ********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:17:34 -0400 (0:00:00.116) 0:03:43.224 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:17:35 -0400 (0:00:00.087) 0:03:43.312 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:17:35 -0400 (0:00:00.111) 0:03:43.423 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:17:35 -0400 (0:00:00.090) 0:03:43.513 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:17:35 -0400 (0:00:00.112) 0:03:43.626 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:17:35 -0400 (0:00:00.112) 0:03:43.738 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:17:35 -0400 (0:00:00.102) 0:03:43.840 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:17:35 -0400 (0:00:00.124) 0:03:43.965 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:17:35 -0400 (0:00:00.122) 0:03:44.088 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:17:35 -0400 (0:00:00.070) 0:03:44.158 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:17:35 -0400 (0:00:00.080) 0:03:44.239 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Change it back to swap] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:80 Monday 20 April 2026 16:17:36 -0400 (0:00:00.092) 0:03:44.331 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node8 META: facts cleared TASK [Run the role] ************************************************************ task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:17:36 -0400 (0:00:00.192) 0:03:44.523 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:17:36 -0400 (0:00:00.065) 0:03:44.589 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:17:36 -0400 (0:00:00.077) 0:03:44.666 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:17:36 -0400 (0:00:00.108) 0:03:44.774 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:17:38 -0400 (0:00:01.524) 0:03:46.299 ********** ok: [managed-node8] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:17:38 -0400 (0:00:00.768) 0:03:47.067 ********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:17:39 -0400 
(0:00:00.235) 0:03:47.303 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:17:39 -0400 (0:00:00.069) 0:03:47.372 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:17:39 -0400 (0:00:00.098) 0:03:47.471 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:17:39 -0400 (0:00:00.067) 0:03:47.538 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:17:39 -0400 (0:00:00.069) 0:03:47.607 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:17:39 -0400 (0:00:00.180) 0:03:47.788 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:17:39 -0400 (0:00:00.067) 0:03:47.855 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:17:39 -0400 (0:00:00.099) 0:03:47.954 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:17:39 -0400 (0:00:00.087) 0:03:48.042 ********** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:17:39 -0400 (0:00:00.144) 0:03:48.187 ********** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "fs_type": "swap", "name": "test1", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] 
*************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:17:40 -0400 (0:00:00.093) 0:03:48.281 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:17:40 -0400 (0:00:00.048) 0:03:48.329 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:17:40 -0400 (0:00:00.058) 0:03:48.387 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:17:40 -0400 (0:00:00.048) 0:03:48.436 ********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": 
"container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": 
"fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": 
"systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": 
"stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { 
"name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:17:42 -0400 (0:00:01.949) 0:03:50.386 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:17:42 -0400 (0:00:00.119) 0:03:50.505 ********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext3" }, { "action": "create format", "device": "/dev/sda", "fs_type": "swap" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "ext3", "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "absent" }, { "path": "/opt/test", "state": "absent" }, { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "present" } ], "packages": [ "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": 
null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:17:46 -0400 (0:00:04.744) 0:03:55.250 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:17:47 -0400 (0:00:00.036) 0:03:55.286 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716208.9072328, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "881420d87525fee8290f16a81681f44134ef919a", "ctime": 1776716208.9042327, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776716208.9042327, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:17:47 -0400 (0:00:00.676) 0:03:55.962 ********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:17:48 -0400 (0:00:00.673) 0:03:56.636 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:17:48 -0400 (0:00:00.130) 0:03:56.766 ********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext3" }, { "action": "create format", "device": "/dev/sda", "fs_type": "swap" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", 
"/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "ext3", "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "absent" }, { "path": "/opt/test", "state": "absent" }, { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "present" } ], "packages": [ "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:17:48 -0400 (0:00:00.113) 0:03:56.880 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:17:48 -0400 (0:00:00.113) 0:03:56.994 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:17:48 -0400 (0:00:00.088) 0:03:57.082 ********** changed: [managed-node8] => (item={'src': 'UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd', 'path': '/opt/test', 'state': 'absent', 'fstype': 'ext3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "fstype": "ext3", "path": "/opt/test", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd", "state": "absent" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=8688be8d-cb24-47e7-b41d-ab163b48e7dd" } ok: [managed-node8] => (item={'path': '/opt/test', 'state': 'absent'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test", "state": "absent" }, "name": "/opt/test", "opts": "defaults", "passno": "0" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:17:50 -0400 (0:00:01.268) 0:03:58.351 ********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:17:50 -0400 (0:00:00.777) 0:03:59.128 ********** changed: [managed-node8] => (item={'src': 'UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e', 'path': 'none', 'fstype': 'swap', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'present', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "present" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:17:51 -0400 (0:00:00.787) 0:03:59.916 ********** skipping: [managed-node8] => (item={'src': 'UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e', 'path': 'none', 'fstype': 'swap', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'present', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "present" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:17:51 -0400 (0:00:00.043) 0:03:59.960 ********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:17:52 -0400 (0:00:00.865) 0:04:00.825 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776713110.4423337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:17:52 -0400 (0:00:00.368) 0:04:01.193 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:17:52 -0400 (0:00:00.022) 0:04:01.216 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:17:53 -0400 (0:00:00.837) 0:04:02.053 ********** ok: [managed-node8] => { "changed": false } TASK [Verify results - 5] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:89 Monday 20 April 2026 16:17:54 -0400 (0:00:00.383) 0:04:02.437 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:17:54 -0400 (0:00:00.047) 0:04:02.484 ********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:17:54 -0400 (0:00:00.020) 0:04:02.505 ********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:17:54 -0400 (0:00:00.033) 0:04:02.539 ********** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "swap", "label": "", "mountpoint": "[SWAP]", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "6d043760-8f2b-4965-ace4-dcb5a150d71e" }, "/dev/sdb": { "fstype": "ext3", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "61b4fe33-23e6-4591-8ccf-71b352914ac9" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:17:54 -0400 (0:00:00.463) 0:04:03.003 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002460", "end": "2026-04-20 16:17:55.071264", "rc": 0, "start": "2026-04-20 16:17:55.068804" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e none swap defaults 0 0
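The last stdout line is the entry the role just wrote: a swap volume has no mount point, so the second fstab field is none, the type is swap, and the source is the UUID of the new swap signature; the "# system_role:storage" comment at the top is the fingerprint the role added earlier. A check equivalent to the fstab verification that follows could be sketched like this (task names and the fstab_out register are hypothetical):

    - name: Read /etc/fstab (illustrative)
      command: cat /etc/fstab
      register: fstab_out  # hypothetical register name
      changed_when: false

    - name: Assert exactly one swap line keyed by the volume UUID
      assert:
        that:
          - fstab_out.stdout_lines | select('search', 'UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e\s+none\s+swap') | list | length == 1

TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:17:55 -0400 (0:00:00.409) 0:04:03.412 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002328", "end": "2026-04-20 16:17:55.501621", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:17:55.499293" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:17:55 -0400 (0:00:00.423) 0:04:03.836 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:17:55 -0400 (0:00:00.064) 0:04:03.900 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:17:55 -0400 (0:00:00.045) 0:04:03.946 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:17:55 -0400 (0:00:00.035) 0:04:03.982 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: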
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:17:55 -0400 (0:00:00.171) 0:04:04.154 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:17:55 -0400 (0:00:00.046) 0:04:04.201 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "[SWAP]", "storage_test_swap_expected_matches": "1" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:17:55 -0400 (0:00:00.030) 0:04:04.231 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:17:56 -0400 (0:00:00.035) 0:04:04.267 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:17:56 -0400 (0:00:00.031) 0:04:04.298 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:17:56 -0400 (0:00:00.018) 0:04:04.316 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:17:56 -0400 (0:00:00.018) 0:04:04.334 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:17:56 -0400 (0:00:00.019) 0:04:04.354 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "realpath", 
"/dev/sda" ], "delta": "0:00:00.002433", "end": "2026-04-20 16:17:56.439998", "rc": 0, "start": "2026-04-20 16:17:56.437565" } STDOUT: /dev/sda TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:17:56 -0400 (0:00:00.406) 0:04:04.761 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/proc/swaps" ], "delta": "0:00:00.002220", "end": "2026-04-20 16:17:56.988016", "rc": 0, "start": "2026-04-20 16:17:56.985796" } STDOUT: Filename Type Size Used Priority /dev/sda partition 10485756 0 -2 TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:17:57 -0400 (0:00:00.563) 0:04:05.324 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:17:57 -0400 (0:00:00.056) 0:04:05.380 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:17:57 -0400 (0:00:00.037) 0:04:05.418 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [ "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e " ], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:17:57 -0400 (0:00:00.093) 0:04:05.512 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:17:57 -0400 (0:00:00.073) 0:04:05.586 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:17:57 -0400 (0:00:00.099) 0:04:05.686 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 
16:17:57 -0400 (0:00:00.042) 0:04:05.728 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:17:57 -0400 (0:00:00.046) 0:04:05.774 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:17:57 -0400 (0:00:00.037) 0:04:05.813 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:17:57 -0400 (0:00:00.053) 0:04:05.866 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:17:57 -0400 (0:00:00.051) 0:04:05.917 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716266.8444288, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776716266.8294287, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 40009, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776716266.8294287, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:17:58 -0400 (0:00:00.681) 0:04:06.599 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:17:58 -0400 (0:00:00.104) 0:04:06.704 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:17:58 -0400 (0:00:00.060) 0:04:06.764 ********** ok: 
[managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:17:58 -0400 (0:00:00.050) 0:04:06.815 ********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:17:58 -0400 (0:00:00.033) 0:04:06.848 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:17:58 -0400 (0:00:00.039) 0:04:06.888 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:17:58 -0400 (0:00:00.080) 0:04:06.968 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:17:58 -0400 (0:00:00.033) 0:04:07.002 ********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:18:01 -0400 (0:00:02.802) 0:04:09.805 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:18:01 -0400 (0:00:00.036) 0:04:09.842 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:18:01 -0400 (0:00:00.097) 0:04:09.939 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:18:01 -0400 (0:00:00.094) 0:04:10.033 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
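[editor] "Ensure cryptsetup is present" installs the tooling unconditionally ("Nothing to do" here because it is already installed), while the LUKS inspection tasks are gated on the volume being encrypted. A sketch of that pattern; volume_encrypted is an illustrative variable, false for this test:

- name: Ensure cryptsetup is present
  package:
    name: cryptsetup
    state: present

- name: Collect LUKS info for this volume
  command: cryptsetup luksDump /dev/sda
  register: luks_info
  changed_when: false
  when: volume_encrypted | d(false)    # false here, so the task is skipped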
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:18:01 -0400 (0:00:00.070) 0:04:10.104 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:18:01 -0400 (0:00:00.045) 0:04:10.150 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:18:01 -0400 (0:00:00.039) 0:04:10.189 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:18:01 -0400 (0:00:00.041) 0:04:10.231 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:18:02 -0400 (0:00:00.028) 0:04:10.259 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:18:02 -0400 (0:00:00.060) 0:04:10.319 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:18:02 -0400 (0:00:00.061) 0:04:10.381 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:18:02 -0400 (0:00:00.074) 0:04:10.455 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:18:02 -0400 (0:00:00.061) 0:04:10.517 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: 
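[editor] The crypttab checks count entries for the volume and compare against an expected total, which is 0 for an unencrypted volume ("_storage_test_expected_crypttab_entries": "0" above). Roughly, under assumed variable names:

- name: Collect /etc/crypttab entries for this volume
  command: cat /etc/crypttab
  register: crypttab
  changed_when: false
  failed_when: false          # the file may be absent entirely

- name: Check for /etc/crypttab entry
  assert:
    that:
      - crypttab.stdout_lines | select('search', 'test1') | list | length == (expected_crypttab_entries | int)
  vars:
    expected_crypttab_entries: 0    # unencrypted volume, so no entry expected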
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:18:02 -0400 (0:00:00.035) 0:04:10.552 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:18:02 -0400 (0:00:00.047) 0:04:10.600 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:18:02 -0400 (0:00:00.086) 0:04:10.687 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:18:02 -0400 (0:00:00.031) 0:04:10.718 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:18:02 -0400 (0:00:00.123) 0:04:10.842 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:18:02 -0400 (0:00:00.054) 0:04:10.896 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:18:02 -0400 (0:00:00.057) 0:04:10.954 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:18:02 -0400 (0:00:00.030) 0:04:10.984 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:18:02 -0400 (0:00:00.056) 0:04:11.041 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
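[editor] All of the RAID tasks are skipped because this volume is type "disk"; for a RAID volume they would parse mdadm output with the regexes set above. A sketch of the intent (the md device path and expected count are hypothetical):

- name: Get information about RAID
  command: mdadm --detail /dev/md/test1     # hypothetical md device
  register: raid_info
  changed_when: false
  when: st_volume_type == 'raid'

- name: Check RAID active devices count
  assert:
    that:
      - "raid_info.stdout is search('Active Devices : ' ~ (expected_active | string))"
  vars:
    expected_active: 2                      # hypothetical expected count
  when: st_volume_type == 'raid'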
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:18:02 -0400 (0:00:00.042) 0:04:11.083 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:18:02 -0400 (0:00:00.052) 0:04:11.136 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:18:03 -0400 (0:00:00.125) 0:04:11.261 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:18:03 -0400 (0:00:00.118) 0:04:11.380 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:18:03 -0400 (0:00:00.058) 0:04:11.438 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:18:03 -0400 (0:00:00.069) 0:04:11.508 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:18:03 -0400 (0:00:00.079) 0:04:11.588 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:18:03 -0400 (0:00:00.109) 0:04:11.698 ********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:18:03 -0400 (0:00:00.085) 0:04:11.783 ********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:18:03 -0400 (0:00:00.140) 0:04:11.924 ********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] 
***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:18:03 -0400 (0:00:00.084) 0:04:12.008 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:18:03 -0400 (0:00:00.070) 0:04:12.079 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:18:03 -0400 (0:00:00.069) 0:04:12.148 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:18:03 -0400 (0:00:00.092) 0:04:12.241 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:18:04 -0400 (0:00:00.146) 0:04:12.387 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:18:04 -0400 (0:00:00.078) 0:04:12.466 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:18:04 -0400 (0:00:00.049) 0:04:12.515 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:18:04 -0400 (0:00:00.044) 0:04:12.559 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:18:04 -0400 (0:00:00.090) 0:04:12.650 ********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:18:04 -0400 (0:00:00.081) 0:04:12.731 ********** skipping: [managed-node8] => {} TASK [Show test 
volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:18:04 -0400 (0:00:00.037) 0:04:12.768 ********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:18:04 -0400 (0:00:00.042) 0:04:12.811 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:18:04 -0400 (0:00:00.034) 0:04:12.846 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:18:04 -0400 (0:00:00.048) 0:04:12.894 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:18:04 -0400 (0:00:00.036) 0:04:12.931 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:18:04 -0400 (0:00:00.059) 0:04:12.990 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:18:04 -0400 (0:00:00.027) 0:04:13.018 ********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:18:04 -0400 (0:00:00.065) 0:04:13.083 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:18:04 -0400 (0:00:00.049) 0:04:13.133 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
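[editor] Every task in this size block is skipped because the volume was requested with an absolute size; with a percentage size (e.g. "60%"), the expected value is derived from the parent pool. A sketch of that arithmetic under assumed inputs:

- name: Calculate the expected size based on pool size and percentage value
  set_fact:
    storage_test_expected_size: "{{ ((pool_size_bytes | int) * (size_percent | int) / 100) | int }}"
  vars:
    pool_size_bytes: 10737418240   # hypothetical 10 GiB pool
    size_percent: 60               # hypothetical "60%" request; yields 6442450944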
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:18:04 -0400 (0:00:00.072) 0:04:13.206 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:18:05 -0400 (0:00:00.062) 0:04:13.269 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:18:05 -0400 (0:00:00.056) 0:04:13.326 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:18:05 -0400 (0:00:00.059) 0:04:13.386 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:18:05 -0400 (0:00:00.064) 0:04:13.450 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:18:05 -0400 (0:00:00.069) 0:04:13.519 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:18:05 -0400 (0:00:00.052) 0:04:13.571 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:18:05 -0400 (0:00:00.070) 0:04:13.642 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:18:05 -0400 (0:00:00.112) 0:04:13.755 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence - 2] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:92 Monday 20 April 2026 16:18:05 
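[editor] The test now repeats the previous invocation to verify idempotence: the role runs again with identical parameters and must schedule no actions. The blivet_output shown further below carries both signals such an assertion would use; a minimal sketch:

- name: Assert the repeated run was a no-op
  assert:
    that:
      - not blivet_output.changed
      - blivet_output.actions | length == 0
    msg: re-running the role with identical parameters must not change anything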
-0400 (0:00:00.076) 0:04:13.832 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node8 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:18:05 -0400 (0:00:00.105) 0:04:13.937 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:18:05 -0400 (0:00:00.078) 0:04:14.016 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:18:05 -0400 (0:00:00.064) 0:04:14.080 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:18:05 -0400 (0:00:00.073) 0:04:14.154 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:18:06 -0400 (0:00:01.054) 0:04:15.208 ********** ok: [managed-node8] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:18:07 -0400 (0:00:00.737) 0:04:15.946 ********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:18:07 -0400 (0:00:00.111) 0:04:16.058 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:18:07 -0400 (0:00:00.092) 0:04:16.151 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:18:07 -0400 (0:00:00.070) 0:04:16.222 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:18:08 -0400 (0:00:00.059) 0:04:16.282 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:18:08 -0400 (0:00:00.068) 0:04:16.350 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:18:08 -0400 (0:00:00.147) 0:04:16.497 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:18:08 -0400 (0:00:00.075) 0:04:16.573 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:18:08 -0400 (0:00:00.079) 0:04:16.652 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:18:08 -0400 (0:00:00.040) 0:04:16.692 ********** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:18:08 -0400 (0:00:00.102) 0:04:16.794 ********** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "fs_type": "swap", "name": "test1", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:18:08 -0400 (0:00:00.033) 0:04:16.828 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:18:08 -0400 (0:00:00.036) 0:04:16.865 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:18:08 -0400 (0:00:00.062) 0:04:16.927 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:18:08 -0400 (0:00:00.106) 0:04:17.033 ********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": 
"cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": 
"sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:18:10 -0400 (0:00:02.114) 0:04:19.148 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:18:10 -0400 (0:00:00.095) 0:04:19.243 ********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "present" } ], "packages": [ "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:18:15 -0400 (0:00:04.202) 0:04:23.446 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:18:15 -0400 (0:00:00.019) 0:04:23.466 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716272.3284473, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "fdf15644903f3ac696cf610d2604fe0e01673aaf", "ctime": 1776716271.5414448, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776716271.5414448, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:18:15 -0400 (0:00:00.364) 0:04:23.830 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:18:15 -0400 (0:00:00.023) 0:04:23.854 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:18:15 -0400 (0:00:00.050) 0:04:23.904 ********** ok: [managed-node8] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], 
"mounts": [ { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "present" } ], "packages": [ "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:18:15 -0400 (0:00:00.032) 0:04:23.937 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:18:15 -0400 (0:00:00.030) 0:04:23.967 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:18:15 -0400 (0:00:00.025) 0:04:23.993 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179
Monday 20 April 2026 16:18:15 -0400 (0:00:00.023) 0:04:24.017 **********
ok: [managed-node8] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184
Monday 20 April 2026 16:18:16 -0400 (0:00:00.732) 0:04:24.750 **********
ok: [managed-node8] => (item={'src': 'UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e', 'path': 'none', 'fstype': 'swap', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'present', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "present" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e" }
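The task above is the role writing the swap line into /etc/fstab. Outside the role, the same entry could be managed directly with Ansible's mount module; a minimal sketch, assuming the UUID shown above (note that swap entries use "none" as the mount path, and that state: present edits fstab without mounting anything):

    - hosts: all
      tasks:
        - name: Record the swap device in /etc/fstab (sketch, not the role's actual task)
          mount:                 # ansible.posix.mount on collection-based installs
            src: UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e
            path: none           # fstab mount point used for swap entries
            fstype: swap
            opts: defaults
            state: present       # manage the fstab line only; no mount/umount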
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195
Monday 20 April 2026 16:18:17 -0400 (0:00:00.645) 0:04:25.395 **********
skipping: [managed-node8] => (item={'src': 'UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e', 'path': 'none', 'fstype': 'swap', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'present', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "swap", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "present" }, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207
Monday 20 April 2026 16:18:17 -0400 (0:00:00.053) 0:04:25.448 **********
ok: [managed-node8] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215
Monday 20 April 2026 16:18:18 -0400 (0:00:00.854) 0:04:26.302 **********
ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776713110.4423337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220
Monday 20 April 2026 16:18:18 -0400 (0:00:00.470) 0:04:26.773 **********

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242
Monday 20 April 2026 16:18:18 -0400 (0:00:00.027) 0:04:26.801 **********
ok: [managed-node8]

TASK [fedora.linux_system_roles.storage : Record role success fingerprint] *****
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16
Monday 20 April 2026 16:18:19 -0400 (0:00:00.867) 0:04:27.668 **********
ok: [managed-node8] => { "changed": false }

TASK [Verify results - 6] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:101
Monday 20 April 2026 16:18:20 -0400 (0:00:00.648) 0:04:28.317 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8

TASK [Print out pool information] **********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Monday 20 April 2026 16:18:20 -0400 (0:00:00.170) 0:04:28.488 **********
skipping: [managed-node8] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Monday 20 April 2026 16:18:20 -0400 (0:00:00.038) 0:04:28.526 **********
ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Monday 20 April 2026 16:18:20 -0400 (0:00:00.061) 0:04:28.588 **********
ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "swap", "label": "", "mountpoint": "[SWAP]", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "6d043760-8f2b-4965-ace4-dcb5a150d71e" }, "/dev/sdb": { "fstype": "ext3", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "61b4fe33-23e6-4591-8ccf-71b352914ac9" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } }
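The listing above confirms /dev/sda is formatted as swap and active ("[SWAP]"). Outside the test harness, a comparable per-device snapshot can be taken with lsblk; a minimal sketch (the test itself gathers this through a bundled helper module, not this command, and the register name is hypothetical):

    - hosts: all
      tasks:
        - name: Gather block device info similar to the listing above (sketch)
          command: lsblk -p -o NAME,FSTYPE,LABEL,MOUNTPOINT,SIZE,TYPE,UUID --json
          register: blk_info
          changed_when: false   # read-only query; never reports a change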
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Monday 20 April 2026 16:18:20 -0400 (0:00:00.619) 0:04:29.207 **********
ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003059", "end": "2026-04-20 16:18:21.275026", "rc": 0, "start": "2026-04-20 16:18:21.271967" }

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e none swap defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Monday 20 April 2026 16:18:21 -0400 (0:00:00.399) 0:04:29.607 **********
ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003021", "end": "2026-04-20 16:18:21.942590", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:18:21.939569" }
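The fstab checks that follow reduce to counting lines of the output above that match the volume's device identifier. In the same spirit, a self-contained sketch (register and task names are hypothetical, not the test's exact expressions):

    - hosts: all
      tasks:
        - name: Read fstab (sketch)
          command: cat /etc/fstab
          register: fstab_out
          changed_when: false
        - name: Expect exactly one entry for the swap volume's UUID (sketch)
          assert:
            that:
              - fstab_out.stdout_lines | select('search', 'UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e') | list | length == 1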
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Monday 20 April 2026 16:18:22 -0400 (0:00:00.663) 0:04:30.271 **********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 20 April 2026 16:18:22 -0400 (0:00:00.064) 0:04:30.336 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Monday 20 April 2026 16:18:22 -0400 (0:00:00.118) 0:04:30.455 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Monday 20 April 2026 16:18:22 -0400 (0:00:00.045) 0:04:30.500 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 20 April 2026 16:18:22 -0400 (0:00:00.315) 0:04:30.816 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 20 April 2026 16:18:22 -0400 (0:00:00.050) 0:04:30.866 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "[SWAP]", "storage_test_swap_expected_matches": "1" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 20 April 2026 16:18:22 -0400 (0:00:00.058) 0:04:30.925 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 20 April 2026 16:18:22 -0400 (0:00:00.092) 0:04:31.017 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Monday 20 April 2026 16:18:22 -0400 (0:00:00.048) 0:04:31.066 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Monday 20 April 2026 16:18:22 -0400 (0:00:00.030) 0:04:31.097 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Monday 20 April 2026 16:18:22 -0400 (0:00:00.047) 0:04:31.144 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
"/dev/sda" ], "delta": "0:00:00.002926", "end": "2026-04-20 16:18:23.276256", "rc": 0, "start": "2026-04-20 16:18:23.273330" } STDOUT: /dev/sda TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:18:23 -0400 (0:00:00.406) 0:04:31.593 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/proc/swaps" ], "delta": "0:00:00.002440", "end": "2026-04-20 16:18:23.784866", "rc": 0, "start": "2026-04-20 16:18:23.782426" } STDOUT: Filename Type Size Used Priority /dev/sda partition 10485756 0 -2 TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:18:23 -0400 (0:00:00.527) 0:04:32.120 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:18:23 -0400 (0:00:00.056) 0:04:32.176 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:18:23 -0400 (0:00:00.034) 0:04:32.211 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [ "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e " ], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:18:24 -0400 (0:00:00.150) 0:04:32.361 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:18:24 -0400 (0:00:00.072) 0:04:32.433 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:18:24 -0400 (0:00:00.052) 0:04:32.486 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 
TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026 16:18:24 -0400 (0:00:00.158) 0:04:32.644 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026 16:18:24 -0400 (0:00:00.069) 0:04:32.714 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026 16:18:24 -0400 (0:00:00.078) 0:04:32.793 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026 16:18:24 -0400 (0:00:00.076) 0:04:32.869 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026 16:18:24 -0400 (0:00:00.131) 0:04:33.001 **********
ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716266.8444288, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776716266.8294287, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 40009, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776716266.8294287, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026 16:18:25 -0400 (0:00:00.736) 0:04:33.738 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026 16:18:25 -0400 (0:00:00.067) 0:04:33.805 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026 16:18:25 -0400 (0:00:00.036) 0:04:33.841 **********
ok:
[managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:18:25 -0400 (0:00:00.056) 0:04:33.897 ********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:18:25 -0400 (0:00:00.037) 0:04:33.935 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:18:25 -0400 (0:00:00.025) 0:04:33.961 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:18:25 -0400 (0:00:00.033) 0:04:33.994 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:18:25 -0400 (0:00:00.031) 0:04:34.025 ********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:18:28 -0400 (0:00:02.511) 0:04:36.537 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:18:28 -0400 (0:00:00.027) 0:04:36.564 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:18:28 -0400 (0:00:00.021) 0:04:36.586 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:18:28 -0400 (0:00:00.036) 0:04:36.622 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:18:28 -0400 (0:00:00.025) 0:04:36.648 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:18:28 -0400 (0:00:00.021) 0:04:36.669 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:18:28 -0400 (0:00:00.019) 0:04:36.689 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:18:28 -0400 (0:00:00.019) 0:04:36.708 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:18:28 -0400 (0:00:00.019) 0:04:36.727 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:18:28 -0400 (0:00:00.026) 0:04:36.754 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:18:28 -0400 (0:00:00.031) 0:04:36.786 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:18:28 -0400 (0:00:00.020) 0:04:36.806 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:18:28 -0400 (0:00:00.024) 0:04:36.830 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:18:28 -0400 (0:00:00.022) 0:04:36.853 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:18:28 -0400 (0:00:00.042) 0:04:36.896 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:18:28 -0400 (0:00:00.034) 0:04:36.931 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:18:28 -0400 (0:00:00.022) 0:04:36.953 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:18:28 -0400 (0:00:00.030) 0:04:36.983 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:18:28 -0400 (0:00:00.027) 0:04:37.011 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:18:28 -0400 (0:00:00.026) 0:04:37.037 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:18:28 -0400 (0:00:00.027) 0:04:37.064 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:18:28 -0400 (0:00:00.036) 0:04:37.101 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:18:28 -0400 (0:00:00.035) 0:04:37.136 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:18:28 -0400 (0:00:00.030) 0:04:37.167 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:18:28 -0400 (0:00:00.042) 0:04:37.210 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:18:28 -0400 (0:00:00.032) 0:04:37.243 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:18:29 -0400 (0:00:00.026) 0:04:37.270 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:37.291 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:18:29 -0400 (0:00:00.029) 0:04:37.321 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:37.342 ********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:18:29 -0400 (0:00:00.025) 0:04:37.368 ********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:18:29 -0400 (0:00:00.033) 0:04:37.401 ********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] 
***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:18:29 -0400 (0:00:00.027) 0:04:37.428 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:18:29 -0400 (0:00:00.029) 0:04:37.458 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:37.479 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:18:29 -0400 (0:00:00.020) 0:04:37.499 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:18:29 -0400 (0:00:00.020) 0:04:37.520 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:18:29 -0400 (0:00:00.020) 0:04:37.541 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:18:29 -0400 (0:00:00.020) 0:04:37.561 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:18:29 -0400 (0:00:00.020) 0:04:37.581 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:37.603 ********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:18:29 -0400 (0:00:00.019) 0:04:37.622 ********** skipping: [managed-node8] => {} TASK [Show test 
volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:18:29 -0400 (0:00:00.019) 0:04:37.642 ********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:18:29 -0400 (0:00:00.019) 0:04:37.661 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:37.682 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:18:29 -0400 (0:00:00.020) 0:04:37.702 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:18:29 -0400 (0:00:00.020) 0:04:37.723 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:18:29 -0400 (0:00:00.019) 0:04:37.743 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:18:29 -0400 (0:00:00.022) 0:04:37.765 ********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:18:29 -0400 (0:00:00.022) 0:04:37.787 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:18:29 -0400 (0:00:00.024) 0:04:37.811 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:18:29 -0400 (0:00:00.022) 0:04:37.834 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:37.855 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:37.877 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:37.898 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:18:29 -0400 (0:00:00.019) 0:04:37.918 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:18:29 -0400 (0:00:00.029) 0:04:37.947 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:18:29 -0400 (0:00:00.020) 0:04:37.968 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:18:29 -0400 (0:00:00.024) 0:04:37.993 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:18:29 -0400 (0:00:00.017) 0:04:38.010 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:104 Monday 20 April 2026 
16:18:29 -0400 (0:00:00.018) 0:04:38.029 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node8 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:18:29 -0400 (0:00:00.051) 0:04:38.080 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:18:29 -0400 (0:00:00.016) 0:04:38.097 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:18:29 -0400 (0:00:00.031) 0:04:38.128 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:18:29 -0400 (0:00:00.021) 0:04:38.150 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:18:30 -0400 (0:00:00.791) 0:04:38.942 ********** ok: [managed-node8] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:18:31 -0400 (0:00:00.339) 0:04:39.282 ********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:18:31 -0400 (0:00:00.051) 0:04:39.333 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:18:31 -0400 (0:00:00.017) 0:04:39.350 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:18:31 -0400 (0:00:00.015) 0:04:39.366 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:18:31 -0400 (0:00:00.016) 0:04:39.383 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:18:31 -0400 (0:00:00.015) 0:04:39.399 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:18:31 -0400 (0:00:00.046) 0:04:39.445 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:18:31 -0400 (0:00:00.021) 0:04:39.466 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:18:31 -0400 (0:00:00.018) 0:04:39.484 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:18:31 -0400 (0:00:00.017) 0:04:39.502 ********** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] 
**************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:18:31 -0400 (0:00:00.023) 0:04:39.526 ********** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "mount_point": "/opt/test", "name": "test1", "state": "absent", "type": "disk" }, { "disks": [ "sdb" ], "mount_point": "none", "name": "test2", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:18:31 -0400 (0:00:00.027) 0:04:39.554 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:18:31 -0400 (0:00:00.018) 0:04:39.572 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:18:31 -0400 (0:00:00.028) 0:04:39.600 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:18:31 -0400 (0:00:00.026) 0:04:39.626 ********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": 
"systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:18:33 -0400 (0:00:01.906) 0:04:41.533 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:18:33 -0400 (0:00:00.094) 0:04:41.628 ********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/sdb", "fs_type": "ext3" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "swap" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "absent" } ], "packages": [ "xfsprogs" ], 
"pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null }, { "_device": "/dev/sdb", "_mount_id": "UUID=61b4fe33-23e6-4591-8ccf-71b352914ac9", "_raw_device": "/dev/sdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:18:37 -0400 (0:00:04.409) 0:04:46.037 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:18:37 -0400 (0:00:00.024) 0:04:46.062 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716272.3284473, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "fdf15644903f3ac696cf610d2604fe0e01673aaf", "ctime": 1776716271.5414448, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 142606535, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776716271.5414448, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "3187282743", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } 
} TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:18:38 -0400 (0:00:00.350) 0:04:46.412 ********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:18:38 -0400 (0:00:00.360) 0:04:46.772 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:18:38 -0400 (0:00:00.029) 0:04:46.802 ********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sdb", "fs_type": "ext3" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "swap" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null }, { "_device": "/dev/sdb", "_mount_id": "UUID=61b4fe33-23e6-4591-8ccf-71b352914ac9", "_raw_device": "/dev/sdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test 
verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:18:38 -0400 (0:00:00.023) 0:04:46.825 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:18:38 -0400 (0:00:00.021) 0:04:46.846 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null }, { "_device": "/dev/sdb", "_mount_id": "UUID=61b4fe33-23e6-4591-8ccf-71b352914ac9", "_raw_device": "/dev/sdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:18:38 -0400 (0:00:00.022) 0:04:46.869 ********** changed: [managed-node8] => (item={'src': 'UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e', 'path': 'none', 'state': 'absent', 'fstype': 'swap'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "fstype": "swap", "path": "none", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "state": "absent" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:18:38 -0400 (0:00:00.378) 0:04:47.247 ********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:18:39 -0400 (0:00:00.694) 0:04:47.942 ********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:18:39 -0400 (0:00:00.020) 0:04:47.962 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:18:39 -0400 (0:00:00.020) 0:04:47.983 ********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:18:40 -0400 (0:00:00.603) 0:04:48.587 ********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776713110.4423337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:18:40 -0400 (0:00:00.346) 0:04:48.934 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:18:40 -0400 (0:00:00.017) 0:04:48.951 ********** ok: [managed-node8] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:18:41 -0400 (0:00:00.746) 0:04:49.697 ********** ok: [managed-node8] => { "changed": false } TASK [Verify results - 7] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:119 Monday 20 April 2026 16:18:41 -0400 (0:00:00.514) 0:04:50.212 ********** included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:18:42 -0400 (0:00:00.068) 0:04:50.280 ********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:18:42 -0400 (0:00:00.028) 0:04:50.309 ********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=6d043760-8f2b-4965-ace4-dcb5a150d71e", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null }, { "_device": "/dev/sdb", "_mount_id": "UUID=61b4fe33-23e6-4591-8ccf-71b352914ac9", "_raw_device": "/dev/sdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:18:42 -0400 (0:00:00.039) 0:04:50.348 ********** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:18:42 -0400 (0:00:00.367) 0:04:50.716 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002369", "end": "2026-04-20 16:18:42.774617", "rc": 0, "start": "2026-04-20 16:18:42.772248" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:18:42 -0400 (0:00:00.369) 0:04:51.085 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.003872", "end": "2026-04-20 16:18:44.171742", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:18:43.167870" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:18:44 -0400 (0:00:01.455) 0:04:52.541 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:18:44 -0400 (0:00:00.033) 0:04:52.574 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:18:44 -0400 (0:00:00.095) 0:04:52.670 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:18:44 -0400 (0:00:00.029) 0:04:52.700 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for 
managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:18:44 -0400 (0:00:00.159) 0:04:52.859 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:18:44 -0400 (0:00:00.026) 0:04:52.885 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "[SWAP]", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:18:44 -0400 (0:00:00.029) 0:04:52.915 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:18:44 -0400 (0:00:00.030) 0:04:52.945 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:18:44 -0400 (0:00:00.015) 0:04:52.961 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:18:44 -0400 (0:00:00.021) 0:04:52.982 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:18:44 -0400 (0:00:00.023) 0:04:53.005 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:18:44 -0400 (0:00:00.024) 0:04:53.030 ********** ok: 
[managed-node8] => { "changed": false, "cmd": [ "realpath", "/dev/sda" ], "delta": "0:00:00.002316", "end": "2026-04-20 16:18:45.265363", "rc": 0, "start": "2026-04-20 16:18:45.263047" } STDOUT: /dev/sda TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:18:45 -0400 (0:00:00.540) 0:04:53.571 ********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/proc/swaps" ], "delta": "0:00:00.002290", "end": "2026-04-20 16:18:45.647777", "rc": 0, "start": "2026-04-20 16:18:45.645487" } STDOUT: Filename Type Size Used Priority TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:18:45 -0400 (0:00:00.415) 0:04:53.986 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:18:45 -0400 (0:00:00.134) 0:04:54.121 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:18:45 -0400 (0:00:00.042) 0:04:54.164 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:18:46 -0400 (0:00:00.128) 0:04:54.292 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:18:46 -0400 (0:00:00.023) 0:04:54.316 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:18:46 -0400 (0:00:00.037) 0:04:54.354 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 
TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026 16:18:46 -0400 (0:00:00.031) 0:04:54.386 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026 16:18:46 -0400 (0:00:00.029) 0:04:54.416 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026 16:18:46 -0400 (0:00:00.020) 0:04:54.437 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026 16:18:46 -0400 (0:00:00.019) 0:04:54.456 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026 16:18:46 -0400 (0:00:00.021) 0:04:54.478 **********
ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716317.6816008, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776716317.6816008, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 40009, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776716317.6816008, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026 16:18:46 -0400 (0:00:00.364) 0:04:54.842 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026 16:18:46 -0400 (0:00:00.031) 0:04:54.874 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
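The device-node check above is a stat call followed by an assertion on the returned flags (the stat output shows isblk: true for /dev/sda, a bare disk that still exists even though its swap formatting was removed). Roughly, under the same naming assumptions as the earlier sketches:

    # Illustrative sketch of the device-node verification pattern.
    - name: See whether the device node is present
      stat:
        path: /dev/sda
        follow: true
      register: dev_node

    - name: Verify the presence/absence of the device node
      assert:
        that:
          - dev_node.stat.exists and dev_node.stat.isblk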
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026 16:18:46 -0400 (0:00:00.036) 0:04:54.910 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 20 April 2026 16:18:46 -0400 (0:00:00.039) 0:04:54.949 **********
ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 20 April 2026 16:18:46 -0400 (0:00:00.043) 0:04:54.992 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 20 April 2026 16:18:46 -0400 (0:00:00.030) 0:04:55.023 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 20 April 2026 16:18:46 -0400 (0:00:00.043) 0:04:55.067 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026 16:18:46 -0400 (0:00:00.027) 0:04:55.094 **********
ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 20 April 2026 16:18:49 -0400 (0:00:02.691) 0:04:57.786 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 20 April 2026 16:18:49 -0400 (0:00:00.042) 0:04:57.829 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 20 April 2026 16:18:49 -0400 (0:00:00.025) 0:04:57.854 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 20 April 2026 16:18:49 -0400 (0:00:00.046) 0:04:57.901 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 20 April 2026 16:18:49 -0400 (0:00:00.029) 0:04:57.931 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 20 April 2026 16:18:49 -0400 (0:00:00.030) 0:04:57.962 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 20 April 2026 16:18:49 -0400 (0:00:00.017) 0:04:57.979 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 20 April 2026 16:18:49 -0400 (0:00:00.025) 0:04:58.005 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 20 April 2026 16:18:49 -0400 (0:00:00.016) 0:04:58.022 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026 16:18:49 -0400 (0:00:00.034) 0:04:58.057 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026 16:18:49 -0400 (0:00:00.034) 0:04:58.091 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026 16:18:49 -0400 (0:00:00.036) 0:04:58.128 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 20 April 2026 16:18:49 -0400 (0:00:00.040) 0:04:58.168 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
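Since the volume is not encrypted, the crypttab checks above reduce to asserting that no /etc/crypttab line references it (_storage_test_expected_crypttab_entries is "0"). A sketch of that assertion, assuming a hypothetical "crypttab" variable registered from the earlier cat /etc/crypttab:

    # Illustrative sketch; "crypttab" and "test_volume_name" are assumed names.
    - name: Check for /etc/crypttab entry
      assert:
        that:
          - crypttab.stdout_lines | select('search', test_volume_name) | list | length == 0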
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 20 April 2026 16:18:49 -0400 (0:00:00.037) 0:04:58.206 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 20 April 2026 16:18:49 -0400 (0:00:00.025) 0:04:58.231 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 20 April 2026 16:18:50 -0400 (0:00:00.026) 0:04:58.258 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 20 April 2026 16:18:50 -0400 (0:00:00.029) 0:04:58.288 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 20 April 2026 16:18:50 -0400 (0:00:00.025) 0:04:58.313 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 20 April 2026 16:18:50 -0400 (0:00:00.029) 0:04:58.343 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 20 April 2026 16:18:50 -0400 (0:00:00.041) 0:04:58.384 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 20 April 2026 16:18:50 -0400 (0:00:00.034) 0:04:58.419 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 20 April 2026 16:18:50 -0400 (0:00:00.064) 0:04:58.484 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 20 April 2026 16:18:50 -0400 (0:00:00.040) 0:04:58.525 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 20 April 2026 16:18:50 -0400 (0:00:00.042) 0:04:58.567 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026 16:18:50 -0400 (0:00:00.065) 0:04:58.632 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026 16:18:50 -0400 (0:00:00.069) 0:04:58.702 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026 16:18:50 -0400 (0:00:00.081) 0:04:58.783 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026 16:18:50 -0400 (0:00:00.044) 0:04:58.828 **********
ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026 16:18:50 -0400 (0:00:00.049) 0:04:58.877 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 16:18:50 -0400 (0:00:00.044) 0:04:58.921 **********
skipping: [managed-node8] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 16:18:50 -0400 (0:00:00.049) 0:04:58.970 **********
skipping: [managed-node8] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 16:18:50 -0400 (0:00:00.035) 0:04:59.006 **********
skipping: [managed-node8] => {}
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 16:18:50 -0400 (0:00:00.024) 0:04:59.030 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 16:18:50 -0400 (0:00:00.035) 0:04:59.066 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 16:18:50 -0400 (0:00:00.020) 0:04:59.086 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 16:18:50 -0400 (0:00:00.039) 0:04:59.126 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 16:18:50 -0400 (0:00:00.020) 0:04:59.147 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 16:18:50 -0400 (0:00:00.036) 0:04:59.184 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 16:18:50 -0400 (0:00:00.047) 0:04:59.231 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026 16:18:51 -0400 (0:00:00.025) 0:04:59.257 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 20 April 2026 16:18:51 -0400 (0:00:00.086) 0:04:59.343 **********
skipping: [managed-node8] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 20 April 2026 16:18:51 -0400 (0:00:00.064) 0:04:59.408 **********
skipping: [managed-node8] => {}
TASK [Show test volume size] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 20 April 2026 16:18:51 -0400 (0:00:00.049) 0:04:59.457 **********
skipping: [managed-node8] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 20 April 2026 16:18:51 -0400 (0:00:00.046) 0:04:59.504 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 20 April 2026 16:18:51 -0400 (0:00:00.049) 0:04:59.553 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 20 April 2026 16:18:51 -0400 (0:00:00.046) 0:04:59.600 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 20 April 2026 16:18:51 -0400 (0:00:00.035) 0:04:59.635 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 20 April 2026 16:18:51 -0400 (0:00:00.040) 0:04:59.676 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 20 April 2026 16:18:51 -0400 (0:00:00.031) 0:04:59.708 **********
ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 20 April 2026 16:18:51 -0400 (0:00:00.026) 0:04:59.734 **********
ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 20 April 2026 16:18:51 -0400 (0:00:00.048) 0:04:59.782 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
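All of the size calculations above are skipped because _storage_test_volume_present is false, so the "VARIABLE IS NOT DEFINED!" debug output for storage_test_expected_size is expected and harmless: the final assertion only fires for volumes that are supposed to exist. The guard looks roughly like this; the attribute and condition names beyond those shown in the log are assumptions, not the suite's literal code:

    # Illustrative sketch of a presence-guarded size assertion.
    - name: Assert expected size is actual size
      assert:
        that:
          - storage_test_actual_size.bytes == storage_test_expected_size | int  # .bytes is an assumed attribute
      when:
        - _storage_test_volume_present | bool
        - storage_test_expected_size is defined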
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 20 April 2026 16:18:51 -0400 (0:00:00.026) 0:04:59.809 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 20 April 2026 16:18:51 -0400 (0:00:00.047) 0:04:59.857 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 16:18:51 -0400 (0:00:00.028) 0:04:59.885 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 16:18:51 -0400 (0:00:00.034) 0:04:59.920 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 16:18:51 -0400 (0:00:00.110) 0:05:00.030 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 16:18:51 -0400 (0:00:00.028) 0:05:00.059 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 16:18:51 -0400 (0:00:00.056) 0:05:00.115 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 16:18:51 -0400 (0:00:00.057) 0:05:00.172 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Monday 20 April 2026 16:18:51 -0400 (0:00:00.037) 0:05:00.210 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Monday 20 April 2026 16:18:51 -0400 (0:00:00.029) 0:05:00.240 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 20 April 2026 16:18:52 -0400 (0:00:00.165) 0:05:00.406 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sdb" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 20 April 2026 16:18:52 -0400 (0:00:00.033) 0:05:00.440 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 20 April 2026 16:18:52 -0400 (0:00:00.044) 0:05:00.484 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 20 April 2026 16:18:52 -0400 (0:00:00.039) 0:05:00.524 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Monday 20 April 2026 16:18:52 -0400 (0:00:00.021) 0:05:00.545 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Monday 20 April 2026 16:18:52 -0400 (0:00:00.024) 0:05:00.570 **********
"skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:18:52 -0400 (0:00:00.024) 0:05:00.594 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:18:52 -0400 (0:00:00.014) 0:05:00.608 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:18:52 -0400 (0:00:00.048) 0:05:00.657 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:18:52 -0400 (0:00:00.037) 0:05:00.695 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:18:52 -0400 (0:00:00.021) 0:05:00.716 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:18:52 -0400 (0:00:00.027) 0:05:00.744 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:18:52 -0400 (0:00:00.082) 0:05:00.827 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:18:52 -0400 (0:00:00.039) 0:05:00.866 ********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 20 April 2026 16:18:52 -0400 (0:00:00.042) 0:05:00.909 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026 16:18:52 -0400 (0:00:00.031) 0:05:00.941 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026 16:18:52 -0400 (0:00:00.060) 0:05:01.002 **********
ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026 16:18:52 -0400 (0:00:00.047) 0:05:01.050 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026 16:18:52 -0400 (0:00:00.046) 0:05:01.096 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026 16:18:52 -0400 (0:00:00.065) 0:05:01.161 **********
ok: [managed-node8] => { "changed": false, "stat": { "atime": 1776716317.6386006, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776716317.6386006, "dev": 6, "device_type": 2064, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 40051, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776716317.6386006, "nlink": 1, "path": "/dev/sdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026 16:18:53 -0400 (0:00:00.723) 0:05:01.885 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed
TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026 16:18:53 -0400 (0:00:00.065) 0:05:01.951 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026 16:18:53 -0400 (0:00:00.113) 0:05:02.064 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 20 April 2026 16:18:53 -0400 (0:00:00.051) 0:05:02.116 **********
ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 20 April 2026 16:18:53 -0400 (0:00:00.075) 0:05:02.191 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 20 April 2026 16:18:54 -0400 (0:00:00.086) 0:05:02.278 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 20 April 2026 16:18:54 -0400 (0:00:00.069) 0:05:02.348 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026 16:18:54 -0400 (0:00:00.100) 0:05:02.448 **********
ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 20 April 2026 16:18:57 -0400 (0:00:02.911) 0:05:05.360 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 20 April 2026 16:18:57 -0400 (0:00:00.056) 0:05:05.417 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 20 April 2026 16:18:57 -0400 (0:00:00.036) 0:05:05.453 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 20 April 2026 16:18:57 -0400 (0:00:00.047) 0:05:05.501 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 20 April 2026 16:18:57 -0400 (0:00:00.060) 0:05:05.562 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 20 April 2026 16:18:57 -0400 (0:00:00.044) 0:05:05.606 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 20 April 2026 16:18:57 -0400 (0:00:00.057) 0:05:05.664 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 20 April 2026 16:18:57 -0400 (0:00:00.034) 0:05:05.699 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 20 April 2026 16:18:57 -0400 (0:00:00.027) 0:05:05.726 **********
ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026 16:18:57 -0400 (0:00:00.069) 0:05:05.796 **********
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026 16:18:57 -0400 (0:00:00.035) 0:05:05.831 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026 16:18:57 -0400 (0:00:00.052) 0:05:05.883 **********
"changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:18:57 -0400 (0:00:00.048) 0:05:05.932 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:18:57 -0400 (0:00:00.049) 0:05:05.981 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:18:57 -0400 (0:00:00.059) 0:05:06.040 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:18:57 -0400 (0:00:00.050) 0:05:06.090 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:18:57 -0400 (0:00:00.065) 0:05:06.156 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:18:57 -0400 (0:00:00.033) 0:05:06.189 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:18:57 -0400 (0:00:00.040) 0:05:06.230 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:18:58 -0400 (0:00:00.034) 0:05:06.264 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:18:58 -0400 (0:00:00.031) 0:05:06.295 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID 
TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 20 April 2026 16:18:58 -0400 (0:00:00.034) 0:05:06.330 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 20 April 2026 16:18:58 -0400 (0:00:00.030) 0:05:06.361 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 20 April 2026 16:18:58 -0400 (0:00:00.031) 0:05:06.393 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026 16:18:58 -0400 (0:00:00.026) 0:05:06.419 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026 16:18:58 -0400 (0:00:00.030) 0:05:06.449 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026 16:18:58 -0400 (0:00:00.031) 0:05:06.481 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026 16:18:58 -0400 (0:00:00.035) 0:05:06.516 **********
ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026 16:18:58 -0400 (0:00:00.047) 0:05:06.564 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 16:18:58 -0400 (0:00:00.069) 0:05:06.634 **********
skipping: [managed-node8] => {}
TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 16:18:58 -0400 (0:00:00.047) 0:05:06.682 **********
skipping: [managed-node8] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 16:18:58 -0400 (0:00:00.052) 0:05:06.735 **********
skipping: [managed-node8] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 16:18:58 -0400 (0:00:00.064) 0:05:06.799 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 16:18:58 -0400 (0:00:00.060) 0:05:06.859 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 16:18:58 -0400 (0:00:00.062) 0:05:06.922 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 16:18:58 -0400 (0:00:00.051) 0:05:06.974 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 16:18:58 -0400 (0:00:00.058) 0:05:07.032 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 16:18:58 -0400 (0:00:00.049) 0:05:07.081 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 16:18:58 -0400 (0:00:00.081) 0:05:07.163 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026 16:18:58 -0400 (0:00:00.051) 0:05:07.214 **********
skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" }
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:18:59 -0400 (0:00:00.073) 0:05:07.287 ********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:18:59 -0400 (0:00:00.031) 0:05:07.319 ********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:18:59 -0400 (0:00:00.028) 0:05:07.347 ********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:18:59 -0400 (0:00:00.029) 0:05:07.376 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:18:59 -0400 (0:00:00.038) 0:05:07.415 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:18:59 -0400 (0:00:00.030) 0:05:07.445 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:18:59 -0400 (0:00:00.042) 0:05:07.487 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:18:59 -0400 (0:00:00.036) 0:05:07.524 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:18:59 -0400 (0:00:00.053) 0:05:07.577 ********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:18:59 -0400 (0:00:00.044) 0:05:07.622 ********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert 
expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:18:59 -0400 (0:00:00.076) 0:05:07.699 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:18:59 -0400 (0:00:00.027) 0:05:07.726 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:18:59 -0400 (0:00:00.025) 0:05:07.751 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:18:59 -0400 (0:00:00.027) 0:05:07.778 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:18:59 -0400 (0:00:00.027) 0:05:07.806 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:18:59 -0400 (0:00:00.024) 0:05:07.830 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:18:59 -0400 (0:00:00.037) 0:05:07.868 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:18:59 -0400 (0:00:00.057) 0:05:07.926 ********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:18:59 -0400 (0:00:00.106) 0:05:08.033 ********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:18:59 -0400 (0:00:00.057) 0:05:08.090 ********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node8 : ok=510 changed=11 unreachable=0 failed=0 skipped=638 rescued=0 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Monday 20 April 2026 16:18:59 -0400 (0:00:00.039) 0:05:08.129 ********** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 8.13s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 7.83s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Record role success fingerprint ----- 6.43s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 fedora.linux_system_roles.storage : Make sure blivet is available ------- 6.16s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.18s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.74s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Gathering Facts --------------------------------------------------------- 4.71s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:2 fedora.linux_system_roles.storage : Make sure required packages are installed --- 4.52s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.41s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get service facts ------------------- 4.25s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.20s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab --- 4.20s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.13s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 3.85s 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Ensure cryptsetup is present -------------------------------------------- 3.85s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Ensure test packages ---------------------------------------------------- 3.84s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Ensure cryptsetup is present -------------------------------------------- 3.26s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Ensure cryptsetup is present -------------------------------------------- 3.21s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Ensure cryptsetup is present -------------------------------------------- 2.91s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Ensure cryptsetup is present -------------------------------------------- 2.80s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10