ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Monday 16 June 2025  17:06:53 -0400 (0:00:00.217)       0:00:00.217 ***********
ok: [managed-node11]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20
Monday 16 June 2025  17:06:57 -0400 (0:00:04.072)       0:00:04.290 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28
Monday 16 June 2025  17:06:58 -0400 (0:00:00.452)       0:00:04.742 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39
Monday 16 June 2025  17:06:58 -0400 (0:00:00.570)       0:00:05.313 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43
Monday 16 June 2025  17:06:58 -0400 (0:00:00.311)       0:00:05.624 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53
Monday 16 June 2025  17:06:59 -0400 (0:00:00.372)       0:00:05.996 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59
Monday 16 June 2025  17:06:59 -0400 (0:00:00.447)       0:00:06.444 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68
Monday 16 June 2025  17:07:00 -0400 (0:00:00.432)       0:00:06.877 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72
Monday 16 June 2025  17:07:00 -0400 (0:00:00.453)       0:00:07.331 ***********
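The "Run the role" step is where the test hands control to the storage role. As a rough sketch (illustrative; only the role name and the two variables are taken from this log), the invocation amounts to:

- name: Run the role
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_pools: []    # nothing requested yet, so this first pass is a no-op
    storage_volumes: []  # the encrypted volume comes in a later invocation

With both lists empty, every management task below either skips or reports no actions.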
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 16 June 2025  17:07:01 -0400 (0:00:00.492)       0:00:07.823 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 16 June 2025  17:07:01 -0400 (0:00:00.264)       0:00:08.088 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 16 June 2025  17:07:01 -0400 (0:00:00.513)       0:00:08.602 ***********
skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 16 June 2025  17:07:02 -0400 (0:00:00.680)       0:00:09.282 ***********
ok: [managed-node11] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 16 June 2025  17:07:05 -0400 (0:00:02.435)       0:00:11.718 ***********
ok: [managed-node11] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 16 June 2025  17:07:05 -0400 (0:00:00.492)       0:00:12.210 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
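The variable loop above skips RedHat.yml and CentOS.yml and loads CentOS_8.yml twice, which is what the usual system-roles vars-loading pattern produces when both the major-version and full-version filenames resolve to the same file. A sketch of that pattern (the convention, not necessarily this role's literal task):

- name: Set platform/version specific variables
  include_vars: "{{ __vars_file }}"
  loop:
    - "{{ ansible_facts['os_family'] }}.yml"
    - "{{ ansible_facts['distribution'] }}.yml"
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
  vars:
    __vars_file: "{{ role_path }}/vars/{{ item }}"
  when: __vars_file is file   # candidates that do not exist are skipped, as seen above

On this host the last two entries would both render to CentOS_8.yml, which would explain the duplicate item.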
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 16 June 2025  17:07:05 -0400 (0:00:00.143)       0:00:12.354 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 16 June 2025  17:07:05 -0400 (0:00:00.155)       0:00:12.510 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Monday 16 June 2025  17:07:06 -0400 (0:00:00.740)       0:00:13.250 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 16 June 2025  17:07:12 -0400 (0:00:05.969)       0:00:19.220 ***********
ok: [managed-node11] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Monday 16 June 2025  17:07:13 -0400 (0:00:00.474)       0:00:19.694 ***********
ok: [managed-node11] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Monday 16 June 2025  17:07:13 -0400 (0:00:00.330)       0:00:20.025 ***********
ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 16 June 2025  17:07:16 -0400 (0:00:02.964)       0:00:22.989 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 16 June 2025  17:07:17 -0400 (0:00:00.789)       0:00:23.779 ***********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 16 June 2025  17:07:17 -0400 (0:00:00.133)       0:00:23.913 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 16 June 2025  17:07:17 -0400 (0:00:00.117)       0:00:24.030 ***********

TASK [fedora.linux_system_roles.storage : Make sure required
packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:07:17 -0400 (0:00:00.156) 0:00:24.187 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:07:21 -0400 (0:00:04.326) 0:00:28.513 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": 
"active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:07:25 -0400 (0:00:04.019) 0:00:32.533 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:07:26 -0400 (0:00:00.368) 0:00:32.902 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:07:26 -0400 (0:00:00.536) 0:00:33.438 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:07:28 -0400 (0:00:02.014) 0:00:35.452 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:07:28 -0400 (0:00:00.110) 0:00:35.563 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750107806.2233012, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1750107804.3082857, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750107804.3082857, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", 
"readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:07:30 -0400 (0:00:01.375) 0:00:36.938 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:07:30 -0400 (0:00:00.262) 0:00:37.201 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:07:30 -0400 (0:00:00.223) 0:00:37.425 *********** ok: [managed-node11] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:07:31 -0400 (0:00:00.309) 0:00:37.734 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:07:31 -0400 (0:00:00.318) 0:00:38.053 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:07:31 -0400 (0:00:00.299) 0:00:38.353 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:07:31 -0400 (0:00:00.254) 0:00:38.608 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:07:32 -0400 (0:00:00.172) 0:00:38.780 *********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:07:32 -0400 (0:00:00.184) 0:00:38.964 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:07:32 -0400 (0:00:00.208) 
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Monday 16 June 2025  17:07:32 -0400 (0:00:00.208)       0:00:39.173 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Monday 16 June 2025  17:07:32 -0400 (0:00:00.278)       0:00:39.452 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750106648.620045, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Monday 16 June 2025  17:07:34 -0400 (0:00:01.395)       0:00:40.848 ***********

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Monday 16 June 2025  17:07:34 -0400 (0:00:00.147)       0:00:40.996 ***********
ok: [managed-node11]

TASK [Get unused disks] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:76
Monday 16 June 2025  17:07:36 -0400 (0:00:01.663)       0:00:42.659 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node11

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Monday 16 June 2025  17:07:36 -0400 (0:00:00.343)       0:00:43.002 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
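The "Find unused disks in the system" task that follows uses the test suite's own helper; the info lines it prints show the selection logic: keep whole disks, drop anything with a filesystem or partitions. A rough standalone approximation of that idea, not the suite's module (the awk parent-name derivation is naive about nvme-style names, and the real helper additionally filters by size and returns only as many disks as requested):

- name: Find unused disks (rough approximation of the test helper)
  shell: |
    lsblk -nlo NAME,TYPE,FSTYPE | awk '
      $2 == "disk" && $3 == "" { cand[$1] = 1 }                        # empty whole disks are candidates
      $2 == "part" { p = $1; sub(/[0-9]+$/, "", p); delete cand[p] }   # partitioned disks drop out
      END { for (d in cand) print d }'
  register: unused_disk_scan
  changed_when: false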
TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Monday 16 June 2025  17:07:41 -0400 (0:00:04.682)       0:00:47.684 ***********
ok: [managed-node11] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] }

TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Monday 16 June 2025  17:07:43 -0400 (0:00:02.353)       0:00:50.038 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Monday 16 June 2025  17:07:43 -0400 (0:00:00.134)       0:00:50.172 ***********
ok: [managed-node11] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Monday 16 June 2025  17:07:43 -0400 (0:00:00.286)       0:00:50.458 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Monday 16 June 2025  17:07:44 -0400 (0:00:00.298)       0:00:50.757 ***********
ok: [managed-node11] => { "unused_disks": [ "sda" ] }
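The first negative test follows: the role is invoked with a LUKS2-encrypted volume and deliberately no key, and verify-role-failed.yml checks that the role raises an error instead of proceeding. Reconstructed from the "Show storage_volumes" output further below, the invocation corresponds to variables roughly like these (the task wrapper is illustrative; encryption_password is named only to point out what is deliberately missing):

- name: Test for correct handling of new encrypted volume w/ no key
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_safe_mode: true            # matches the storage_safe_mode_global value captured below
    storage_volumes:
      - name: foo
        type: disk
        disks: "{{ unused_disks }}"    # resolves to ['sda'] on this run
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        # no encryption_password given -- the role is expected to fail here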
TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:85
Monday 16 June 2025  17:07:44 -0400 (0:00:00.171)       0:00:50.929 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Monday 16 June 2025  17:07:44 -0400 (0:00:00.380)       0:00:51.309 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Monday 16 June 2025  17:07:44 -0400 (0:00:00.197)       0:00:51.507 ***********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 16 June 2025  17:07:45 -0400 (0:00:00.347)       0:00:51.855 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 16 June 2025  17:07:45 -0400 (0:00:00.313)       0:00:52.168 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 16 June 2025  17:07:45 -0400 (0:00:00.251)       0:00:52.419 ***********
skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 16 June 2025  17:07:46 -0400 (0:00:00.331)       0:00:52.750 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 16 June 2025  17:07:46 -0400 (0:00:00.488)       0:00:53.239 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 16 June 2025  17:07:46 -0400 (0:00:00.208)       0:00:53.447 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 16 June 2025  17:07:46 -0400 (0:00:00.180)       0:00:53.628 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 16 June 2025  17:07:47 -0400 (0:00:00.217)       0:00:53.845 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Monday 16 June 2025  17:07:47 -0400 (0:00:00.708)       0:00:54.553 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 16 June 2025  17:07:52 -0400 (0:00:04.669)       0:00:59.223 ***********
ok: [managed-node11] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Monday 16 June 2025  17:07:52 -0400 (0:00:00.220)       0:00:59.444 ***********
ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Monday 16 June 2025  17:07:52 -0400 (0:00:00.178)       0:00:59.622 ***********
ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 16 June 2025  17:07:58 -0400 (0:00:05.282)       0:01:04.905 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 16 June 2025  17:07:58 -0400 (0:00:00.411)       0:01:05.316 ***********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 16 June 2025  17:07:58 -0400 (0:00:00.293)       0:01:05.610 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 16 June 2025  17:07:59 -0400 (0:00:00.358)       0:01:05.968 ***********

TASK [fedora.linux_system_roles.storage : Make sure required
packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:07:59 -0400 (0:00:00.203) 0:01:06.172 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:08:04 -0400 (0:00:04.752) 0:01:10.925 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": 
"active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:08:07 -0400 (0:00:02.830) 0:01:13.755 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:08:07 -0400 (0:00:00.363) 0:01:14.119 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:08:07 -0400 (0:00:00.149) 0:01:14.268 *********** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:08:12 -0400 (0:00:05.216) 0:01:19.485 *********** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:08:12 -0400 (0:00:00.108) 0:01:19.594 *********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:08:13 -0400 (0:00:00.110) 0:01:19.704 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:08:13 -0400 (0:00:00.098) 0:01:19.802 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify 
correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Monday 16 June 2025 17:08:13 -0400 (0:00:00.311) 0:01:20.114 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:101 Monday 16 June 2025 17:08:13 -0400 (0:00:00.169) 0:01:20.283 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:08:14 -0400 (0:00:00.403) 0:01:20.687 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:08:14 -0400 (0:00:00.343) 0:01:21.031 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:08:14 -0400 (0:00:00.587) 0:01:21.618 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:08:15 -0400 (0:00:00.500) 0:01:22.119 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate 
system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:08:15 -0400 (0:00:00.298) 0:01:22.417 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:08:15 -0400 (0:00:00.209) 0:01:22.626 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:08:16 -0400 (0:00:00.191) 0:01:22.818 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:08:16 -0400 (0:00:00.232) 0:01:23.050 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:08:16 -0400 (0:00:00.495) 0:01:23.546 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:08:21 -0400 (0:00:04.636) 0:01:28.182 *********** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:08:21 -0400 (0:00:00.189) 0:01:28.372 *********** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:08:21 -0400 (0:00:00.169) 0:01:28.541 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:08:27 -0400 (0:00:05.167) 0:01:33.708 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR 
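
This second pass repeats the same volume definition with the one missing piece supplied, an encryption_password. Reconstructed from the "Show storage_volumes" output above (the test's literal YAML may differ):

    - name: Create an encrypted disk volume w/ default fs
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo  # throwaway test password from this log

With a passphrase available, the same blivet call that failed before proceeds to create the LUKS2 container.
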
support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:08:27 -0400 (0:00:00.249) 0:01:33.957 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:08:27 -0400 (0:00:00.134) 0:01:34.092 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:08:27 -0400 (0:00:00.178) 0:01:34.270 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:08:27 -0400 (0:00:00.139) 0:01:34.410 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:08:32 -0400 (0:00:04.754) 0:01:39.165 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": 
"mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:08:35 -0400 (0:00:02.762) 0:01:41.927 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:08:35 -0400 (0:00:00.421) 0:01:42.349 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:08:35 -0400 (0:00:00.277) 0:01:42.626 *********** changed: [managed-node11] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "password": "-", "state": "present" } ], 
"leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:08:49 -0400 (0:00:13.400) 0:01:56.026 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:08:49 -0400 (0:00:00.149) 0:01:56.176 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750107806.2233012, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1750107804.3082857, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750107804.3082857, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:08:51 -0400 (0:00:01.485) 0:01:57.662 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the 
systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:08:53 -0400 (0:00:02.615) 0:02:00.277 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:08:53 -0400 (0:00:00.232) 0:02:00.510 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:08:54 -0400 (0:00:00.346) 0:02:00.857 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:08:54 -0400 (0:00:00.195) 0:02:01.053 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:08:54 -0400 (0:00:00.218) 0:02:01.272 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:08:54 -0400 (0:00:00.192) 0:02:01.464 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:08:59 -0400 (0:00:04.794) 0:02:06.259 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:09:02 -0400 (0:00:02.627) 0:02:08.886 *********** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:09:02 -0400 (0:00:00.338) 0:02:09.225 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:09:04 -0400 (0:00:01.662) 0:02:10.888 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750106648.620045, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:09:05 -0400 (0:00:01.699) 0:02:12.587 *********** changed: [managed-node11] => (item={'backing_device': '/dev/sda', 'name': 'luks-fec24d04-7e30-407c-9ca8-79d9c198f63d', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Monday 16 June 2025 17:09:07 -0400 (0:00:01.746) 0:02:14.333 *********** ok: [managed-node11] TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:114 Monday 16 June 2025 17:09:09 -0400 (0:00:01.880) 0:02:16.214 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:09:10 -0400 (0:00:00.496) 0:02:16.711 *********** skipping: [managed-node11] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:09:10 -0400 (0:00:00.290) 0:02:17.001 *********** ok: [managed-node11] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:09:10 -0400 (0:00:00.254) 0:02:17.255 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "size": "10G", "type": "crypt", "uuid": "5a6a137b-42a6-42b8-bee6-390c258bd977" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "fec24d04-7e30-407c-9ca8-79d9c198f63d" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:09:12 -0400 (0:00:02.111) 0:02:19.367 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002504", "end": "2025-06-16 
17:09:14.647887", "rc": 0, "start": "2025-06-16 17:09:14.645383" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 16 June 2025 17:09:14 -0400 (0:00:02.211) 0:02:21.578 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002993", "end": "2025-06-16 17:09:16.116220", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:09:16.113227" } STDOUT: luks-fec24d04-7e30-407c-9ca8-79d9c198f63d /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 16 June 2025 17:09:16 -0400 (0:00:01.405) 0:02:22.983 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:09:16 -0400 (0:00:00.197) 0:02:23.181 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:09:17 -0400 (0:00:00.481) 0:02:23.662 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:09:17 -0400 (0:00:00.282) 0:02:23.944 *********** included: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:09:18 -0400 (0:00:01.356) 0:02:25.301 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:09:18 -0400 (0:00:00.251) 0:02:25.553 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:09:19 -0400 (0:00:00.329) 0:02:25.882 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:09:19 -0400 (0:00:00.287) 0:02:26.170 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:09:19 -0400 (0:00:00.316) 0:02:26.486 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:09:20 -0400 (0:00:00.164) 0:02:26.650 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:09:20 -0400 (0:00:00.319) 0:02:26.970 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 16 June 2025 17:09:20 -0400 (0:00:00.309) 0:02:27.279 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 16 June 2025 17:09:20 -0400 (0:00:00.232) 0:02:27.512 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 16 June 2025 17:09:21 -0400 (0:00:00.237) 0:02:27.749 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 16 June 2025 17:09:21 -0400 (0:00:00.241) 0:02:27.991 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 16 June 2025 17:09:21 -0400 (0:00:00.225) 0:02:28.216 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 16 June 2025 17:09:22 -0400 (0:00:00.419) 0:02:28.636 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 16 June 2025 17:09:22 -0400 (0:00:00.341) 0:02:28.978 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 16 June 2025 17:09:22 -0400 (0:00:00.260) 0:02:29.238 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 16 June 2025 17:09:22 -0400 (0:00:00.336) 0:02:29.575 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 16 June 2025 17:09:23 -0400 (0:00:00.352) 0:02:29.928 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 16 June 2025 17:09:23 -0400 (0:00:00.163) 0:02:30.091 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 16 June 2025 17:09:23 -0400 (0:00:00.348) 0:02:30.439 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 16 June 2025 17:09:24 -0400 (0:00:00.298) 0:02:30.738 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108128.935896, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108128.935896, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36924, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1750108128.935896, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 16 June 2025 17:09:25 -0400 (0:00:01.688) 0:02:32.427 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 16 June 2025 17:09:26 -0400 (0:00:00.293) 0:02:32.720 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 16 June 2025 17:09:26 -0400 (0:00:00.294) 0:02:33.015 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 16 June 2025 17:09:26 -0400 (0:00:00.223) 0:02:33.239 *********** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 16 June 2025 17:09:26 -0400 (0:00:00.283) 0:02:33.523 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 16 June 2025 17:09:27 -0400 (0:00:00.340) 0:02:33.863 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 16 June 2025 17:09:27 -0400 (0:00:00.282) 0:02:34.146 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108129.064897, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108129.064897, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 144737, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750108129.064897, "nlink": 1, "path": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 16 June 2025 17:09:29 -0400 (0:00:01.516) 0:02:35.662 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 16 June 2025 17:09:33 -0400 (0:00:04.468) 0:02:40.131 *********** ok: [managed-node11] => { "changed": false, 
"cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.012488", "end": "2025-06-16 17:09:34.565630", "rc": 0, "start": "2025-06-16 17:09:34.553142" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: fec24d04-7e30-407c-9ca8-79d9c198f63d Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 937728 Threads: 2 Salt: c1 5c ed 56 51 93 4b 49 b6 cf 99 26 40 f4 6f 4b 03 60 70 86 fa 2d 55 0d ef 3e 56 a4 a0 f2 28 a6 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: fb 25 94 74 b4 24 8b 16 2f ea 2b 49 04 a9 85 bb 91 9f 92 24 74 01 45 e4 c4 d0 c6 64 f6 11 f9 0c Digest: 89 66 52 0c 3a 05 92 be 09 e9 46 e0 f8 59 ad 85 46 46 6e 9c 05 48 2d 2b ac 41 26 50 a1 df 3c 58 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 16 June 2025 17:09:34 -0400 (0:00:01.317) 0:02:41.448 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 16 June 2025 17:09:35 -0400 (0:00:00.324) 0:02:41.773 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 16 June 2025 17:09:35 -0400 (0:00:00.286) 0:02:42.059 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 16 June 2025 17:09:35 -0400 (0:00:00.222) 0:02:42.282 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 16 June 2025 17:09:35 -0400 (0:00:00.143) 0:02:42.425 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 16 June 2025 17:09:35 -0400 (0:00:00.188) 0:02:42.614 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 16 June 2025 17:09:36 -0400 (0:00:00.076) 0:02:42.690 *********** 
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 16 June 2025 17:09:36 -0400 (0:00:00.088) 0:02:42.779 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-fec24d04-7e30-407c-9ca8-79d9c198f63d /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 16 June 2025 17:09:36 -0400 (0:00:00.222) 0:02:43.001 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 16 June 2025 17:09:36 -0400 (0:00:00.196) 0:02:43.198 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 16 June 2025 17:09:36 -0400 (0:00:00.176) 0:02:43.374 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 16 June 2025 17:09:37 -0400 (0:00:00.260) 0:02:43.635 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 16 June 2025 17:09:37 -0400 (0:00:00.265) 0:02:43.900 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 16 June 2025 17:09:37 -0400 (0:00:00.141) 0:02:44.041 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 16 June 2025 17:09:37 -0400 (0:00:00.250) 0:02:44.292 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 16 June 2025 17:09:37 -0400 (0:00:00.261) 
0:02:44.554 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 16 June 2025 17:09:38 -0400 (0:00:00.187) 0:02:44.741 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 16 June 2025 17:09:38 -0400 (0:00:00.265) 0:02:45.007 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:09:38 -0400 (0:00:00.248) 0:02:45.255 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:09:38 -0400 (0:00:00.286) 0:02:45.541 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:09:39 -0400 (0:00:00.223) 0:02:45.765 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:09:39 -0400 (0:00:00.203) 0:02:45.968 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:09:39 -0400 (0:00:00.248) 0:02:46.217 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:09:39 -0400 (0:00:00.280) 0:02:46.498 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:09:40 -0400 (0:00:00.265) 0:02:46.763 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] 
********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:09:40 -0400 (0:00:00.220) 0:02:46.984 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:09:40 -0400 (0:00:00.248) 0:02:47.233 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:09:40 -0400 (0:00:00.171) 0:02:47.404 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:09:40 -0400 (0:00:00.174) 0:02:47.579 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:09:41 -0400 (0:00:00.202) 0:02:47.782 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:09:41 -0400 (0:00:00.248) 0:02:48.030 *********** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:09:41 -0400 (0:00:00.280) 0:02:48.310 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:09:41 -0400 (0:00:00.238) 0:02:48.549 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:09:42 -0400 (0:00:00.288) 0:02:48.838 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:09:42 -0400 (0:00:00.352) 0:02:49.190 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate 
maximum usable space in thin pool] *****************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 16 June 2025 17:09:42 -0400 (0:00:00.254) 0:02:49.445 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 16 June 2025 17:09:42 -0400 (0:00:00.183) 0:02:49.628 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 16 June 2025 17:09:43 -0400 (0:00:00.248) 0:02:49.876 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 16 June 2025 17:09:43 -0400 (0:00:00.175) 0:02:50.052 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 16 June 2025 17:09:43 -0400 (0:00:00.202) 0:02:50.255 ***********
skipping: [managed-node11] => {}
TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 16 June 2025 17:09:43 -0400 (0:00:00.204) 0:02:50.460 ***********
skipping: [managed-node11] => {}
TASK [Show test volume size] ***************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 16 June 2025 17:09:44 -0400 (0:00:00.221) 0:02:50.681 ***********
skipping: [managed-node11] => {}
TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 16 June 2025 17:09:44 -0400 (0:00:00.233) 0:02:50.914 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 16 June 2025 17:09:44 -0400 (0:00:00.248) 0:02:51.163 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 16 June 2025 17:09:44 -0400 (0:00:00.193) 0:02:51.357 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 16 June 2025 17:09:44 -0400 (0:00:00.268) 0:02:51.625 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 16 June 2025 17:09:45 -0400 (0:00:00.285) 0:02:51.911 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show actual size] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 16 June 2025 17:09:45 -0400 (0:00:00.176) 0:02:52.088 ***********
ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }
TASK [Show expected size] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 16 June 2025 17:09:45 -0400 (0:00:00.227) 0:02:52.316 ***********
ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }
TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 16 June 2025 17:09:45 -0400 (0:00:00.188) 0:02:52.504 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
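Every size-verification step above is skipped in this run: the test volume is a plain disk volume, so there is no thin pool percentage to check. For a thin volume sized by percentage, the expected value is essentially pool_size * percent / 100, clamped by the thin pool's reserved-space bounds, which is what the skipped task chain computes. A minimal standalone sketch of the core calculation; the variable names test_pool_size and test_volume_percent are illustrative, not the test suite's own:

  - hosts: localhost
    gather_facts: false
    vars:
      test_pool_size: 10737418240   # 10 GiB pool, in bytes (illustrative)
      test_volume_percent: 60       # volume requested as "60%" (illustrative)
    tasks:
      - name: Calculate the expected size based on pool size and percentage value
        set_fact:
          storage_test_expected_size: "{{ (test_pool_size * test_volume_percent / 100) | int }}"

      - name: Show expected size
        debug:
          var: storage_test_expected_size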
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 16 June 2025 17:09:46 -0400 (0:00:00.174) 0:02:52.679 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 16 June 2025 17:09:46 -0400 (0:00:00.195) 0:02:52.874 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check segment type] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 16 June 2025 17:09:46 -0400 (0:00:00.217) 0:02:53.091 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 16 June 2025 17:09:46 -0400 (0:00:00.143) 0:02:53.235 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 16 June 2025 17:09:46 -0400 (0:00:00.324) 0:02:53.560 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set expected cache size] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 16 June 2025 17:09:47 -0400 (0:00:00.242) 0:02:53.802 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 16 June 2025 17:09:47 -0400 (0:00:00.299) 0:02:54.101 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 16 June 2025 17:09:47 -0400 (0:00:00.223) 0:02:54.325 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 16 June 2025 17:09:47 -0400 (0:00:00.168) 0:02:54.493 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
TASK [Create a file] ***********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Monday 16 June 2025 17:09:47 -0400 (0:00:00.128) 0:02:54.621 ***********
changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Test for correct handling of safe_mode] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:120
Monday 16 June 2025 17:09:50 -0400 (0:00:02.584) 0:02:57.206 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11
TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Monday 16 June 2025 17:09:51 -0400 (0:00:00.892) 0:02:58.098 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Monday 16 June 2025 17:09:51 -0400 (0:00:00.315) 0:02:58.413 ***********
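The nested storage-role run that follows is the invocation under test: with safe mode left at its default of true, the role is asked to turn encryption off on a volume that currently carries LUKS2 formatting, and the test expects this to fail rather than destroy the device. A hedged sketch of what such an invocation looks like, mirroring the storage_volumes value echoed a little further below (the task name is illustrative; the variables are the role's public ones):

  - name: Ask the role to remove encryption while safe mode is on (expected to fail)
    include_role:
      name: fedora.linux_system_roles.storage
    vars:
      storage_safe_mode: true            # the default; blocks destructive reformatting
      storage_volumes:
        - name: foo
          type: disk
          disks: [sda]
          mount_point: /opt/test1
          encryption: false              # asks for removal of the existing LUKS layer
          encryption_luks_version: luks2
          encryption_password: yabbadabbadoo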
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 16 June 2025 17:09:52 -0400 (0:00:00.300) 0:02:58.714 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 16 June 2025 17:09:52 -0400 (0:00:00.323) 0:02:59.038 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 16 June 2025 17:09:52 -0400 (0:00:00.255) 0:02:59.293 ***********
skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 16 June 2025 17:09:53 -0400 (0:00:00.501) 0:02:59.795 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 16 June 2025 17:09:53 -0400 (0:00:00.192) 0:02:59.988 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 16 June 2025 17:09:53 -0400 (0:00:00.197) 0:03:00.186 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage
: Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:09:53 -0400 (0:00:00.164) 0:03:00.350 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:09:53 -0400 (0:00:00.262) 0:03:00.613 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:09:54 -0400 (0:00:00.736) 0:03:01.349 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:09:59 -0400 (0:00:04.756) 0:03:06.106 *********** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:09:59 -0400 (0:00:00.262) 0:03:06.369 *********** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:10:00 -0400 (0:00:00.342) 0:03:06.711 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:10:05 -0400 (0:00:05.621) 0:03:12.333 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:10:06 -0400 (0:00:00.398) 0:03:12.731 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:10:06 -0400 (0:00:00.229) 0:03:12.961 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:10:06 -0400 (0:00:00.226) 0:03:13.187 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:10:06 -0400 (0:00:00.201) 0:03:13.389 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:10:11 -0400 (0:00:04.659) 0:03:18.049 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:10:14 -0400 (0:00:03.145) 0:03:21.194 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:10:14 -0400 (0:00:00.319) 0:03:21.513 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:10:15 -0400 (0:00:00.211) 0:03:21.724 *********** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-fec24d04-7e30-407c-9ca8-79d9c198f63d' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:10:20 -0400 (0:00:05.385) 0:03:27.110 *********** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-fec24d04-7e30-407c-9ca8-79d9c198f63d' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:10:20 -0400 (0:00:00.223) 0:03:27.334 *********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:10:20 -0400 (0:00:00.248) 0:03:27.582 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:10:21 -0400 
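The two fatal results above are the point of this test: in safe mode, blivet refuses to wipe the LUKS formatting that honoring encryption: false would require, and the role surfaces that refusal as a failed task. The test then catches the failure and asserts on it instead of letting the play die. One way to express that catch-and-assert pattern is a block/rescue around the role include; this is a hedged sketch of the idea, not necessarily how verify-role-failed.yml itself is written (task names and the asserted substring are taken from the output above):

  - name: Verify role raises correct error
    block:
      - name: Run the role, asking for encryption removal while safe mode is on
        include_role:
          name: fedora.linux_system_roles.storage
        vars:
          storage_safe_mode: true
          storage_volumes:
            - { name: foo, type: disk, disks: [sda], mount_point: /opt/test1, encryption: false, encryption_password: yabbadabbadoo }
      - name: Fail if the role ran to completion
        fail:
          msg: "an error was expected but the role succeeded"
    rescue:
      - name: Check that we failed in the role
        assert:
          that:
            - "'cannot remove existing formatting' in ansible_failed_result.msg"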
TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Monday 16 June 2025 17:10:21 -0400 (0:00:00.284) 0:03:27.867 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Monday 16 June 2025 17:10:21 -0400 (0:00:00.331) 0:03:28.198 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Stat the file] ***********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Monday 16 June 2025 17:10:21 -0400 (0:00:00.192) 0:03:28.391 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108190.3113904, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750108190.3113904, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1750108190.3113904, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2880238312", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Assert file presence] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Monday 16 June 2025 17:10:23 -0400 (0:00:01.442) 0:03:29.833 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
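Having confirmed the role failed safely, the test checks that the data survived: /opt/test1/quux, created before the attempted change, must still exist (the checksum above is the well-known SHA-1 of empty input, matching the zero-byte file that was created). The pattern is a stat followed by an assert; a condensed sketch of it, with an illustrative register name (the path matches the test):

  - name: Stat the file
    stat:
      path: /opt/test1/quux
    register: __storage_test_file

  - name: Assert file presence
    assert:
      that:
        - __storage_test_file.stat.exists
      msg: "file /opt/test1/quux was not preserved"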
"CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:10:25 -0400 (0:00:00.591) 0:03:32.042 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:10:25 -0400 (0:00:00.323) 0:03:32.366 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:10:26 -0400 (0:00:00.299) 0:03:32.666 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:10:26 -0400 (0:00:00.177) 0:03:32.843 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:10:26 -0400 (0:00:00.216) 0:03:33.059 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:10:26 -0400 (0:00:00.529) 0:03:33.589 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:10:31 -0400 (0:00:04.502) 0:03:38.092 *********** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:10:31 -0400 (0:00:00.249) 0:03:38.341 *********** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:10:31 -0400 (0:00:00.226) 0:03:38.568 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:10:37 -0400 (0:00:05.097) 0:03:43.665 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:10:37 -0400 (0:00:00.475) 0:03:44.140 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:10:37 -0400 (0:00:00.228) 0:03:44.369 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:10:38 -0400 (0:00:00.312) 0:03:44.681 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:10:38 -0400 (0:00:00.168) 0:03:44.850 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:10:42 -0400 (0:00:04.733) 0:03:49.584 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": 
"irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": 
"nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": 
{ "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:10:45 -0400 (0:00:02.541) 0:03:52.125 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:10:45 -0400 (0:00:00.286) 0:03:52.412 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:10:45 -0400 (0:00:00.153) 0:03:52.566 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 
16 June 2025 17:10:50 -0400 (0:00:05.048) 0:03:57.615 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:10:51 -0400 (0:00:00.244) 0:03:57.859 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108141.9440007, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d991fff1fa6855fb13c252101ce5678c80950dc4", "ctime": 1750108141.9410007, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750108141.9410007, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:10:53 -0400 (0:00:01.809) 0:03:59.668 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:10:54 -0400 (0:00:01.694) 0:04:01.362 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:10:54 -0400 (0:00:00.255) 0:04:01.618 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:10:55 -0400 (0:00:00.281) 0:04:01.900 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:10:55 -0400 (0:00:00.349) 0:04:02.249 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:10:55 -0400 (0:00:00.254) 0:04:02.504 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-fec24d04-7e30-407c-9ca8-79d9c198f63d" } 
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:10:57 -0400 (0:00:01.445) 0:04:03.950 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:10:58 -0400 (0:00:01.422) 0:04:05.373 *********** changed: [managed-node11] => (item={'src': 'UUID=088da657-9a96-4542-b1c5-ff959fc7d8de', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:10:59 -0400 (0:00:01.120) 0:04:06.493 *********** skipping: [managed-node11] => (item={'src': 'UUID=088da657-9a96-4542-b1c5-ff959fc7d8de', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:11:00 -0400 (0:00:00.192) 0:04:06.686 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:11:01 -0400 (0:00:01.583) 0:04:08.270 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108156.115115, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e2a5647237cfde9f99f1d54f86fd2c6da7784384", "ctime": 1750108147.3680444, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104021, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1750108147.3670444, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "3231173902", "wgrp": false, "woth": 
false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:11:03 -0400 (0:00:01.445) 0:04:09.716 *********** changed: [managed-node11] => (item={'backing_device': '/dev/sda', 'name': 'luks-fec24d04-7e30-407c-9ca8-79d9c198f63d', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Monday 16 June 2025 17:11:04 -0400 (0:00:01.444) 0:04:11.160 *********** ok: [managed-node11] TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:155 Monday 16 June 2025 17:11:06 -0400 (0:00:01.942) 0:04:13.103 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:11:07 -0400 (0:00:00.556) 0:04:13.660 *********** skipping: [managed-node11] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:11:07 -0400 (0:00:00.288) 0:04:13.949 *********** ok: [managed-node11] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:11:07 -0400 (0:00:01.559) 0:04:14.254 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "088da657-9a96-4542-b1c5-ff959fc7d8de" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:11:09 -0400 (0:00:01.559) 0:04:15.813 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002638", "end": "2025-06-16 17:11:10.352830", "rc": 0, "start": "2025-06-16 17:11:10.350192" } STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=088da657-9a96-4542-b1c5-ff959fc7d8de /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 16 June 2025 17:11:10 -0400 (0:00:01.460) 0:04:17.274 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003252", "end": "2025-06-16 17:11:11.841085", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:11:11.837833" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 16 June 2025 17:11:12 -0400 (0:00:01.488) 0:04:18.762 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:11:12 -0400 (0:00:00.232) 0:04:18.994 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:11:12 -0400 (0:00:00.542) 0:04:19.537 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:11:13 -0400 (0:00:00.225) 0:04:19.762 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included:
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:11:14 -0400 (0:00:01.245) 0:04:21.007 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:11:14 -0400 (0:00:00.442) 0:04:21.450 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:11:15 -0400 (0:00:00.339) 0:04:21.789 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:11:15 -0400 (0:00:00.346) 0:04:22.135 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:11:15 -0400 (0:00:00.323) 0:04:22.459 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:11:16 -0400 (0:00:00.296) 0:04:22.756 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:11:16 -0400 (0:00:00.292) 0:04:23.048 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 16 June 2025 17:11:16 -0400 (0:00:00.330) 0:04:23.379 *********** skipping: [managed-node11] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 16 June 2025 17:11:17 -0400 (0:00:00.385) 0:04:23.764 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 16 June 2025 17:11:17 -0400 (0:00:00.257) 0:04:24.021 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 16 June 2025 17:11:17 -0400 (0:00:00.235) 0:04:24.257 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 16 June 2025 17:11:17 -0400 (0:00:00.197) 0:04:24.455 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 16 June 2025 17:11:18 -0400 (0:00:00.697) 0:04:25.153 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 16 June 2025 17:11:18 -0400 (0:00:00.303) 0:04:25.456 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 16 June 2025 17:11:19 -0400 (0:00:00.244) 0:04:25.701 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 16 June 2025 17:11:19 -0400 (0:00:00.301) 0:04:26.002 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 16 June 2025 17:11:19 -0400 (0:00:00.325) 0:04:26.328 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 16 June 2025 17:11:19 -0400 (0:00:00.146) 0:04:26.474 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 16 June 2025 17:11:20 -0400 (0:00:00.319) 0:04:26.793 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 16 June 2025 17:11:20 -0400 (0:00:00.309) 0:04:27.103 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108250.5618756, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108250.5618756, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36924, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1750108250.5618756, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 16 June 2025 17:11:21 -0400 (0:00:01.518) 0:04:28.622 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 16 June 2025 17:11:22 -0400 (0:00:00.174) 0:04:28.796 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 16 June 2025 17:11:22 -0400 (0:00:00.160) 0:04:28.957 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task 
path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 16 June 2025 17:11:22 -0400 (0:00:00.208) 0:04:29.165 ***********
ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 16 June 2025 17:11:22 -0400 (0:00:00.193) 0:04:29.359 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 16 June 2025 17:11:22 -0400 (0:00:00.230) 0:04:29.590 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 16 June 2025 17:11:23 -0400 (0:00:00.232) 0:04:29.822 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 16 June 2025 17:11:23 -0400 (0:00:00.200) 0:04:30.023 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 16 June 2025 17:11:28 -0400 (0:00:04.732) 0:04:34.755 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 16 June 2025 17:11:28 -0400 (0:00:00.218) 0:04:34.973 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 16 June 2025 17:11:28 -0400 (0:00:00.271) 0:04:35.244 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 16 June 2025 17:11:29 -0400 (0:00:00.397) 0:04:35.642 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 16 June 2025 17:11:29 -0400 (0:00:00.330) 0:04:35.973 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 16 June 2025 17:11:29 -0400 (0:00:00.282) 0:04:36.255 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 16 June 2025 17:11:29 -0400 (0:00:00.352) 0:04:36.608 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 16 June 2025 17:11:30 -0400 (0:00:00.241) 0:04:36.849 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 16 June 2025 17:11:30 -0400 (0:00:00.277) 0:04:37.126 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 16 June 2025 17:11:30 -0400 (0:00:00.287) 0:04:37.413 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 16 June 2025 17:11:31 -0400 (0:00:00.283) 0:04:37.697 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 16 June 2025 17:11:31 -0400 (0:00:00.208) 0:04:37.905 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 16 June 2025 17:11:31 -0400 (0:00:00.157) 0:04:38.063 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 16 June 2025 17:11:31 -0400 (0:00:00.200) 0:04:38.476 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 16 June 2025 17:11:32 -0400 (0:00:00.272) 0:04:38.748 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 16 June 2025 17:11:32 -0400 (0:00:00.264) 0:04:39.013 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 16 June 2025 17:11:32 -0400 (0:00:00.280) 0:04:39.293 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 16 June 2025 17:11:32 -0400 (0:00:00.261) 0:04:39.555 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:11:33 -0400 (0:00:00.264) 0:04:39.819 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:11:33 -0400 (0:00:00.322) 0:04:40.142 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:11:33 -0400 (0:00:00.345) 0:04:40.488 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:11:34 -0400 (0:00:00.274) 0:04:40.762 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:11:34 -0400 (0:00:00.119) 0:04:40.881 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:11:34 -0400 (0:00:00.257) 0:04:41.138 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:11:34 -0400 (0:00:00.305) 0:04:41.444 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:11:35 -0400 (0:00:00.341) 0:04:41.785 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:11:35 -0400 (0:00:00.249) 0:04:42.034 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:11:35 -0400 (0:00:00.321) 0:04:42.356 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:11:35 -0400 (0:00:00.224) 0:04:42.580 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:11:36 -0400 (0:00:00.218) 0:04:42.799 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:11:36 -0400 (0:00:00.151) 0:04:42.950 *********** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:11:36 -0400 (0:00:00.119) 0:04:43.069 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } 
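The expected-size arithmetic above (and the thin-pool calculations that follow) only comes into play for LVM volumes whose size is expressed as a percentage of the pool; for this plain-disk volume every branch is skipped. A sketch of the kind of spec those checks apply to, assuming the role's documented percentage-size support (pool and volume names are illustrative):

    - name: Volume sized as a percentage of its pool (illustrative)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: ["sda"]
            volumes:
              - name: test1
                size: "60%"           # expected size = 60% of the pool size
                mount_point: /opt/test1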
TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 16 June 2025 17:11:36 -0400 (0:00:00.089) 0:04:43.159 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 16 June 2025 17:11:36 -0400 (0:00:00.118) 0:04:43.277 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 16 June 2025 17:11:36 -0400 (0:00:00.179) 0:04:43.457 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 16 June 2025 17:11:37 -0400 (0:00:00.219) 0:04:43.677 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 16 June 2025 17:11:37 -0400 (0:00:00.188) 0:04:43.866 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 16 June 2025 17:11:37 -0400 (0:00:00.265) 0:04:44.132 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 16 June 2025 17:11:37 -0400 (0:00:00.257) 0:04:44.389 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 16 June 2025 17:11:37 -0400 (0:00:00.170) 0:04:44.560 ***********
skipping: [managed-node11] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 16 June 2025 17:11:38 -0400 (0:00:00.291) 0:04:44.851 ***********
skipping: [managed-node11] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 16 June 2025 17:11:38 -0400 (0:00:00.199) 0:04:45.051 ***********
skipping: [managed-node11] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 16 June 2025 17:11:38 -0400 (0:00:00.202) 0:04:45.254 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 16 June 2025 17:11:38 -0400 (0:00:00.263) 0:04:45.517 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 16 June 2025 17:11:39 -0400 (0:00:00.227) 0:04:45.745 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 16 June 2025 17:11:39 -0400 (0:00:00.262) 0:04:46.008 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 16 June 2025 17:11:39 -0400 (0:00:00.237) 0:04:46.246 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 16 June 2025 17:11:39 -0400 (0:00:00.178) 0:04:46.424 ***********
ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 16 June 2025 17:11:40 -0400 (0:00:00.278) 0:04:46.703 ***********
ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 16 June 2025 17:11:40 -0400 (0:00:00.265) 0:04:46.969 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 16 June 2025 17:11:40 -0400 (0:00:00.336) 0:04:47.305 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 16 June 2025 17:11:40 -0400 (0:00:00.249) 0:04:47.554 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 16 June 2025 17:11:41 -0400 (0:00:00.259) 0:04:47.814 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 16 June 2025 17:11:41 -0400 (0:00:00.215) 0:04:48.030 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 16 June 2025 17:11:41 -0400 (0:00:00.248) 0:04:48.278 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 16 June 2025 17:11:42 -0400 (0:00:00.612) 0:04:48.890 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 16 June 2025 17:11:42 -0400 (0:00:00.273) 0:04:49.164 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 16 June 2025 17:11:42 -0400 (0:00:00.283) 0:04:49.448 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 16 June 2025 17:11:42 -0400 (0:00:00.151) 0:04:49.599 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Monday 16 June 2025 17:11:43 -0400 (0:00:00.241) 0:04:49.840 ***********
changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
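The file just created at /opt/test1/quux is the data the next role run must not destroy: with safe mode enabled (the role's documented default, mirrored below as storage_safe_mode_global: true), the role is expected to fail rather than reformat a device that carries existing data. A minimal sketch of pinning the setting explicitly (illustrative, not the test's code):

    - name: Run the storage role with safe mode left on (illustrative)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true   # default; destructive re-formats are refused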
TASK [Test for correct handling of safe_mode] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:161
Monday 16 June 2025 17:11:44 -0400 (0:00:01.626) 0:04:51.467 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Monday 16 June 2025 17:11:45 -0400 (0:00:00.636) 0:04:52.104 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Monday 16 June 2025 17:11:45 -0400 (0:00:00.225) 0:04:52.329 ***********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 16 June 2025 17:11:46 -0400 (0:00:00.434) 0:04:52.764 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 16 June 2025 17:11:46 -0400 (0:00:00.380) 0:04:53.144 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 16 June 2025 17:11:46 -0400 (0:00:00.264) 0:04:53.409 ***********
skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
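The loop above tries RedHat.yml, then CentOS.yml, then CentOS_8.yml, loading the most specific vars file that exists (here CentOS_8.yml, which supplies blivet_package_list). A sketch of the general pattern using the first_found lookup (not the role's exact code):

    - name: Include the most specific platform vars file (illustrative)
      ansible.builtin.include_vars: "{{ lookup('first_found', params) }}"
      vars:
        params:
          files:
            - "CentOS_8.yml"   # distribution + major version (most specific)
            - "CentOS.yml"     # distribution
            - "RedHat.yml"     # os family fallback
          paths:
            - "{{ role_path }}/vars"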
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 16 June 2025 17:11:47 -0400 (0:00:00.694) 0:04:54.104 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 16 June 2025 17:11:47 -0400 (0:00:00.263) 0:04:54.368 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 16 June 2025 17:11:48 -0400 (0:00:00.261) 0:04:54.630 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 16 June 2025 17:11:48 -0400 (0:00:00.139) 0:04:54.769 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 16 June 2025 17:11:48 -0400 (0:00:00.257) 0:04:55.026 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Monday 16 June 2025 17:11:48 -0400 (0:00:00.571) 0:04:55.598 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 16 June 2025 17:11:53 -0400 (0:00:04.781) 0:05:00.380 ***********
ok: [managed-node11] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Monday 16 June 2025 17:11:54 -0400 (0:00:00.333) 0:05:00.714 ***********
ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Monday 16 June 2025 17:11:54 -0400 (0:00:00.253) 0:05:00.967 ***********
ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
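The storage_volumes value printed above corresponds to an invocation like the following (values copied from the log; the play wrapper is illustrative). Because /opt/test1 already holds data and safe mode is on, encrypting sda in place is exactly the destructive operation this test expects to fail:

    - hosts: managed-node11
      tasks:
        - name: Attempt to add LUKS2 encryption to the existing volume (sketch)
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo
                type: disk
                disks: ["sda"]
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo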
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 16 June 2025 17:11:59 -0400 (0:00:05.591) 0:05:06.558 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 16 June 2025 17:12:00 -0400 (0:00:00.486) 0:05:07.045 ***********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 16 June 2025 17:12:00 -0400 (0:00:00.350) 0:05:07.396 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 16 June 2025 17:12:01 -0400 (0:00:00.404) 0:05:07.801 ***********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Monday 16 June 2025 17:12:01 -0400 (0:00:00.229) 0:05:08.031 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Monday 16 June 2025 17:12:06 -0400 (0:00:05.067) 0:05:13.098 ***********
ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name":
"cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", 
"status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service": { "name": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service": { "name": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:12:09 -0400 (0:00:02.946) 0:05:16.045 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", 
"systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:12:09 -0400 (0:00:00.288) 0:05:16.333 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2dfec24d04\x2d7e30\x2d407c\x2d9ca8\x2d79d9c198f63d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "name": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-fec24d04-7e30-407c-9ca8-79d9c198f63d", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-fec24d04-7e30-407c-9ca8-79d9c198f63d /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-fec24d04-7e30-407c-9ca8-79d9c198f63d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:11:01 EDT", "StateChangeTimestampMonotonic": "1897347984", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d7e30\x2d407c\x2d9ca8\x2d79d9c198f63d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "name": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", 
"SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:12:12 -0400 (0:00:03.196) 0:05:19.530 *********** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:12:18 -0400 (0:00:05.513) 0:05:25.043 *********** fatal: [managed-node11]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:12:18 -0400 (0:00:00.246) 0:05:25.290 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2dfec24d04\x2d7e30\x2d407c\x2d9ca8\x2d79d9c198f63d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "name": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dfec24d04\\x2d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": 
"0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d7e30\x2d407c\x2d9ca8\x2d79d9c198f63d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "name": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", 
"LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7e30\\x2d407c\\x2d9ca8\\x2d79d9c198f63d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:12:22 -0400 (0:00:03.498) 0:05:28.789 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:12:22 -0400 (0:00:00.330) 0:05:29.120 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Monday 16 June 2025 17:12:22 -0400 (0:00:00.434) 0:05:29.554 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 16 June 2025 17:12:23 -0400 (0:00:00.309) 0:05:29.864 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108304.5263085, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750108304.5263085, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1750108304.5263085, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3443974010", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 16 June 2025 17:12:24 -0400 (0:00:01.458) 0:05:31.322 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:182 Monday 16 June 2025 17:12:25 -0400 (0:00:00.326) 0:05:31.649 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:12:25 -0400 (0:00:00.647) 0:05:32.297 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:12:26 -0400 (0:00:00.392) 0:05:32.690 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version 
specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:12:26 -0400 (0:00:00.247) 0:05:32.938 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:12:26 -0400 (0:00:00.593) 0:05:33.531 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:12:27 -0400 (0:00:00.232) 0:05:33.764 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:12:27 -0400 (0:00:00.246) 0:05:34.011 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:12:27 -0400 (0:00:00.283) 0:05:34.294 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:12:27 -0400 (0:00:00.183) 0:05:34.477 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for 
managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:12:28 -0400 (0:00:00.612) 0:05:35.090 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:12:33 -0400 (0:00:05.478) 0:05:40.569 *********** ok: [managed-node11] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:12:34 -0400 (0:00:00.283) 0:05:40.852 *********** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:12:34 -0400 (0:00:00.309) 0:05:41.162 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:12:39 -0400 (0:00:05.355) 0:05:46.518 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:12:40 -0400 (0:00:00.428) 0:05:46.946 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:12:40 -0400 (0:00:00.176) 0:05:47.122 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:12:40 -0400 (0:00:00.301) 0:05:47.424 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:12:40 -0400 (0:00:00.202) 0:05:47.626 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 
Monday 16 June 2025 17:12:45 -0400 (0:00:04.413) 0:05:52.040 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": 
"initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": 
"rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": 
"systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": 
"user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:12:48 -0400 (0:00:02.991) 0:05:55.031 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:12:48 -0400 (0:00:00.293) 0:05:55.325 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:12:48 -0400 (0:00:00.231) 0:05:55.556 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, 
"mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:13:02 -0400 (0:00:13.331) 0:06:08.887 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:13:02 -0400 (0:00:00.272) 0:06:09.160 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108259.676949, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "554923f75c60d0e61fe11264b776828da362d305", "ctime": 1750108259.6739488, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750108259.6739488, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:13:03 -0400 (0:00:01.236) 0:06:10.397 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:13:05 -0400 (0:00:01.614) 0:06:12.011 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:13:05 -0400 (0:00:00.249) 0:06:12.260 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", 
"/dev/xvda1", "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:13:05 -0400 (0:00:00.290) 0:06:12.551 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:13:06 -0400 (0:00:00.283) 0:06:12.835 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:13:06 -0400 (0:00:00.201) 0:06:13.036 *********** changed: [managed-node11] => (item={'src': 'UUID=088da657-9a96-4542-b1c5-ff959fc7d8de', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=088da657-9a96-4542-b1c5-ff959fc7d8de" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:13:07 -0400 (0:00:01.449) 0:06:14.486 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:13:09 -0400 (0:00:01.640) 0:06:16.126 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:13:11 -0400 (0:00:01.593) 0:06:17.720 *********** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:13:11 -0400 (0:00:00.335) 0:06:18.056 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Monday 16 June 2025 17:13:13 -0400 (0:00:01.856) 0:06:19.912 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108271.8400464, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750108264.2609856, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 337641618, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1750108264.2599857, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4264263251", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Monday 16 June 2025 17:13:14 -0400 (0:00:01.571) 0:06:21.484 ***********
changed: [managed-node11] => (item={'backing_device': '/dev/sda', 'name': 'luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "password": "-", "state": "present" } }
MSG: line added

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Monday 16 June 2025 17:13:16 -0400 (0:00:01.574) 0:06:23.058 ***********
ok: [managed-node11]

TASK [Verify role results] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:196
Monday 16 June 2025 17:13:18 -0400 (0:00:01.802) 0:06:24.861 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11

TASK [Print out pool information] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Monday 16 June 2025 17:13:18 -0400 (0:00:00.432) 0:06:25.294 ***********
skipping: [managed-node11] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Monday 16 June 2025 17:13:18 -0400 (0:00:00.237) 0:06:25.532 ***********
ok: [managed-node11] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "",
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:13:19 -0400 (0:00:00.265) 0:06:25.797 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "size": "10G", "type": "crypt", "uuid": "897fdd03-0ca9-46d1-b69f-2046d3ed5704" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "3b147b0f-a34f-43bf-a4ef-e30d56b04273" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:13:20 -0400 (0:00:01.260) 0:06:27.057 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002383", "end": "2025-06-16 17:13:21.336110", "rc": 0, "start": "2025-06-16 17:13:21.333727" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Monday 16 June 2025 17:13:21 -0400 (0:00:01.021) 0:06:28.079 ***********
ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002382", "end": "2025-06-16 17:13:22.099279", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:13:22.096897" }
STDOUT:
luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273 /dev/sda -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Monday 16 June 2025 17:13:22 -0400 (0:00:00.815) 0:06:28.895 ***********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 16 June 2025 17:13:22 -0400 (0:00:00.112) 0:06:29.008 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Monday 16 June 2025 17:13:22 -0400 (0:00:00.285) 0:06:29.293 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
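Note: _storage_volume_tests above drives the rest of the verification; each entry pulls in the matching test-verify-volume-<subset>.yml include below. Outside the test suite, the same end state can be spot-checked with ad-hoc tasks like the following — a hypothetical debugging snippet, not part of the role, reusing commands the suite itself runs later:

    - name: Confirm /dev/sda carries a LUKS2 header
      command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false

    - name: Confirm the opened mapper device is mounted where fstab says
      command: findmnt /opt/test1
      changed_when: false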
TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Monday 16 June 2025 17:13:22 -0400 (0:00:00.230) 0:06:29.524 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 16 June 2025 17:13:23 -0400 (0:00:00.612) 0:06:30.136 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 16 June 2025 17:13:23 -0400 (0:00:00.197) 0:06:30.334 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 16 June 2025 17:13:23 -0400 (0:00:00.196) 0:06:30.530 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 16 June 2025 17:13:24 -0400 (0:00:00.134) 0:06:30.665 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Monday 16 June 2025 17:13:24 -0400 (0:00:00.216) 0:06:30.882 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Monday 16 June 2025 17:13:24 -0400 (0:00:00.204) 0:06:31.087 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Monday 16 June 2025 17:13:24 -0400 (0:00:00.256) 0:06:31.343 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Monday 16 June 2025 17:13:24 -0400 (0:00:00.267) 0:06:31.610 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Monday 16 June 2025 17:13:25 -0400 (0:00:00.112) 0:06:31.723 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 16 June 2025 17:13:25 -0400 (0:00:00.156) 0:06:31.880 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 16 June 2025 17:13:25 -0400 (0:00:00.210) 0:06:32.090 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 16 June 2025 17:13:25 -0400 (0:00:00.168) 0:06:32.258 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 16 June 2025 17:13:26 -0400 (0:00:00.478) 0:06:32.737 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 16 June 2025 17:13:26 -0400 (0:00:00.270) 0:06:33.007 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 16 June 2025 17:13:26 -0400 (0:00:00.558) 0:06:33.566 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 16 June 2025 17:13:27 -0400 (0:00:00.225) 0:06:33.791 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
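Note: the *_matches lists used by the assertions above are built by filtering the fstab content captured earlier (registered as storage_test_fstab, cleared again at the end of the verification) for the device path, the mount point, and the mount options, then comparing hit counts against the expected values. Conceptually it reduces to something like this simplified sketch, not the role's verbatim expressions:

- name: Count fstab lines that reference the volume by device path (sketch)
  set_fact:
    storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines
                                       | select('search', '/dev/mapper/luks-')
                                       | list }}"

- name: The device must appear in fstab exactly once
  assert:
    that:
      - storage_test_fstab_id_matches | length == 1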
TASK [Clean up variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 16 June 2025 17:13:27 -0400 (0:00:00.337) 0:06:34.129 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 16 June 2025 17:13:27 -0400 (0:00:00.202) 0:06:34.331 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 16 June 2025 17:13:28 -0400 (0:00:00.360) 0:06:34.692 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 16 June 2025 17:13:28 -0400 (0:00:00.262) 0:06:34.954 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108381.833928, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108381.833928, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36924, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1750108381.833928, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 16 June 2025 17:13:29 -0400 (0:00:01.270) 0:06:36.225 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 16 June 2025 17:13:29 -0400 (0:00:00.210) 0:06:36.436 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 16 June 2025 17:13:29 -0400 (0:00:00.212) 0:06:36.624 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 16 June 2025 17:13:30 -0400 (0:00:00.212) 0:06:36.836 ***********
ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 16 June 2025 17:13:30 -0400 (0:00:00.205) 0:06:37.042 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 16 June 2025 17:13:30 -0400 (0:00:00.181) 0:06:37.224 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 16 June 2025 17:13:30 -0400 (0:00:00.248) 0:06:37.472 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108381.9819293, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108381.9819293, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 171201, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750108381.9819293, "nlink": 1, "path": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 16 June 2025 17:13:31 -0400 (0:00:01.094) 0:06:38.566 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 16 June 2025 17:13:36 -0400 (0:00:04.507) 0:06:43.074 ***********
ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.010059", "end": "2025-06-16 17:13:37.557141", "rc": 0, "start": "2025-06-16 17:13:37.547082" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           3b147b0f-a34f-43bf-a4ef-e30d56b04273
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
     offset: 16777216 [bytes]
     length: (whole device)
     cipher: aes-xts-plain64
     sector: 512 [bytes]

Keyslots:
  0: luks2
     Key:        512 bits
     Priority:   normal
     Cipher:     aes-xts-plain64
     Cipher key: 512 bits
     PBKDF:      argon2i
     Time cost:  4
     Memory:     930270
     Threads:    2
     Salt:       8d 18 0e f5 2b 64 ab a0 bb 71 ae d6 f8 a8 b7 3a
                 55 e1 4b 0e 0d 53 58 4f 35 e1 e2 95 ca cb 6c 6b
     AF stripes: 4000
     AF hash:    sha256
     Area offset:32768 [bytes]
     Area length:258048 [bytes]
     Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
     Hash:       sha256
     Iterations: 120470
     Salt:       3d 96 b6 6b 6d 6e b5 06 30 1a 19 f9 a0 97 5b de
                 5a 6b 27 b3 be e6 ea f3 ae 7a 96 d1 d6 75 31 43
     Digest:     87 37 5f ad ae 3f 3c da 3e 53 fc 0e 9c 1b 21 2d
                 d5 98 22 93 27 c3 23 07 3a 07 67 ca dd 2e 4b 52

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 16 June 2025 17:13:37 -0400 (0:00:01.442) 0:06:44.517 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 16 June 2025 17:13:38 -0400 (0:00:00.373) 0:06:44.890 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 16 June 2025 17:13:38 -0400 (0:00:00.234) 0:06:45.124 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 16 June 2025 17:13:38 -0400 (0:00:00.357) 0:06:45.482 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 16 June 2025 17:13:39 -0400 (0:00:00.305) 0:06:45.788 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 16 June 2025 17:13:39 -0400 (0:00:00.370) 0:06:46.158 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 16 June 2025 17:13:39 -0400 (0:00:00.231) 0:06:46.390 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 16 June 2025 17:13:39 -0400 (0:00:00.197) 0:06:46.588 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
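Note: each /etc/crypttab line has the form <mapper name> <backing device> <key file> [options]; the "-" key file recorded above means no key file is stored on disk, so the passphrase must be supplied by other means (here the role passed it via encryption_password at create time). The format checks that follow boil down to assertions like this sketch, which reuses the facts just set but is not the role's exact task code:

- name: Validate the crypttab entry for the volume (sketch)
  assert:
    that:
      - _storage_test_crypttab_entries | length == 1
      - _storage_test_crypttab_entries[0].split() | length == 3
      - _storage_test_crypttab_entries[0].split()[1] == '/dev/sda'
      - _storage_test_crypttab_entries[0].split()[2] == _storage_test_expected_crypttab_key_file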
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 16 June 2025 17:13:40 -0400 (0:00:00.323) 0:06:46.911 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 16 June 2025 17:13:40 -0400 (0:00:00.234) 0:06:47.146 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 16 June 2025 17:13:40 -0400 (0:00:00.377) 0:06:47.523 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 16 June 2025 17:13:41 -0400 (0:00:00.296) 0:06:47.820 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 16 June 2025 17:13:41 -0400 (0:00:00.184) 0:06:48.004 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 16 June 2025 17:13:41 -0400 (0:00:00.133) 0:06:48.137 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 16 June 2025 17:13:41 -0400 (0:00:00.244) 0:06:48.382 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 16 June 2025 17:13:41 -0400 (0:00:00.171) 0:06:48.554 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 16 June 2025 17:13:42 -0400 (0:00:00.242) 0:06:48.797 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] **************************************************** task path:
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 16 June 2025 17:13:42 -0400 (0:00:00.175) 0:06:48.972 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:13:42 -0400 (0:00:00.234) 0:06:49.206 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:13:42 -0400 (0:00:00.127) 0:06:49.334 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:13:42 -0400 (0:00:00.099) 0:06:49.434 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:13:42 -0400 (0:00:00.157) 0:06:49.592 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:13:43 -0400 (0:00:00.129) 0:06:49.721 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:13:43 -0400 (0:00:00.175) 0:06:49.897 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:13:43 -0400 (0:00:00.156) 0:06:50.053 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:13:43 -0400 (0:00:00.218) 0:06:50.272 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:13:43 -0400 (0:00:00.266) 0:06:50.539 *********** ok: 
[managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:13:44 -0400 (0:00:00.297) 0:06:50.836 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:13:44 -0400 (0:00:00.265) 0:06:51.102 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:13:44 -0400 (0:00:00.226) 0:06:51.329 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:13:44 -0400 (0:00:00.198) 0:06:51.527 *********** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:13:45 -0400 (0:00:00.166) 0:06:51.693 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:13:45 -0400 (0:00:00.196) 0:06:51.890 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:13:45 -0400 (0:00:00.122) 0:06:52.013 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:13:45 -0400 (0:00:00.190) 0:06:52.203 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 16 June 2025 17:13:45 -0400 (0:00:00.071) 0:06:52.275 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 16 June 2025 17:13:45 -0400 
(0:00:00.180) 0:06:52.455 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:13:46 -0400 (0:00:00.326) 0:06:52.782 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:13:46 -0400 (0:00:00.293) 0:06:53.075 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:13:46 -0400 (0:00:00.180) 0:06:53.256 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:13:46 -0400 (0:00:00.159) 0:06:53.415 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:13:46 -0400 (0:00:00.202) 0:06:53.618 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:13:47 -0400 (0:00:00.163) 0:06:53.782 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:13:47 -0400 (0:00:00.205) 0:06:53.987 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:13:47 -0400 (0:00:00.198) 0:06:54.185 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:13:47 -0400 (0:00:00.191) 0:06:54.377 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 
17:13:47 -0400 (0:00:00.226) 0:06:54.603 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:13:48 -0400 (0:00:00.165) 0:06:54.768 *********** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:13:48 -0400 (0:00:00.206) 0:06:54.975 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:13:48 -0400 (0:00:00.204) 0:06:55.179 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:13:48 -0400 (0:00:00.236) 0:06:55.416 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:13:49 -0400 (0:00:00.491) 0:06:55.907 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:13:49 -0400 (0:00:00.227) 0:06:56.135 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:13:49 -0400 (0:00:00.231) 0:06:56.367 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:13:50 -0400 (0:00:00.268) 0:06:56.636 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 16 June 2025 17:13:50 -0400 (0:00:00.230) 0:06:56.866 *********** skipping: [managed-node11] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 16 June 2025 17:13:50 -0400 (0:00:00.348) 0:06:57.214 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 16 June 2025 17:13:50 -0400 (0:00:00.358) 0:06:57.572 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 16 June 2025 17:13:51 -0400 (0:00:00.214) 0:06:57.786 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:203 Monday 16 June 2025 17:13:51 -0400 (0:00:00.190) 0:06:57.977 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 16 June 2025 17:13:51 -0400 (0:00:00.605) 0:06:58.582 *********** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 16 June 2025 17:13:52 -0400 (0:00:00.287) 0:06:58.870 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:13:52 -0400 (0:00:00.327) 0:06:59.197 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:13:52 -0400 (0:00:00.399) 0:06:59.597 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:13:53 -0400 (0:00:00.272) 0:06:59.869 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: 
[managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:13:53 -0400 (0:00:00.548) 0:07:00.417 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:13:54 -0400 (0:00:00.295) 0:07:00.713 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:13:54 -0400 (0:00:00.263) 0:07:00.977 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:13:54 -0400 (0:00:00.159) 0:07:01.137 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:13:54 -0400 (0:00:00.317) 0:07:01.455 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:13:55 -0400 (0:00:00.651) 0:07:02.106 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do
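The storage_pools value echoed by the next debug task is the crux of this scenario: the volume requests LUKS2 encryption but deliberately carries no key material. Reconstructed as the playbook YAML it corresponds to (names and values are taken directly from the log output that follows):

storage_pools:
  - name: foo
    type: partition
    disks:
      - sda
    volumes:
      - name: test1
        type: partition
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        # Deliberately no encryption_password or encryption_key: the role
        # is expected to fail with "encrypted volume 'test1' missing
        # key/password" further down.

TASK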
[fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:14:00 -0400 (0:00:04.687) 0:07:06.794 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:14:00 -0400 (0:00:00.173) 0:07:06.967 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:14:00 -0400 (0:00:00.144) 0:07:07.112 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:14:05 -0400 (0:00:05.175) 0:07:12.288 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:14:06 -0400 (0:00:00.352) 0:07:12.640 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:14:06 -0400 (0:00:00.176) 0:07:12.817 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:14:06 -0400 (0:00:00.279) 0:07:13.097 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:14:06 -0400 (0:00:00.288) 0:07:13.385 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:14:11 -0400 (0:00:04.596) 0:07:17.981 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:14:14 -0400 (0:00:03.259) 0:07:21.241 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:14:14 -0400 (0:00:00.346) 0:07:21.587 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:14:15 -0400 (0:00:00.250) 0:07:21.838 *********** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:14:20 -0400 (0:00:05.702) 0:07:27.540 *********** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:14:21 -0400 (0:00:00.317) 0:07:27.858 *********** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:14:21 -0400 (0:00:00.107) 0:07:27.966 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:14:21 -0400 (0:00:00.121) 0:07:28.088 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Monday 16 June 2025 17:14:21 -0400 (0:00:00.125) 0:07:28.214 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:223 Monday 16 June 2025 17:14:21 -0400 (0:00:00.081) 0:07:28.295 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:14:22 -0400 (0:00:00.828) 0:07:29.124 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:14:22 -0400 (0:00:00.278) 0:07:29.402 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:14:23 -0400 (0:00:00.231) 0:07:29.633 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, 
"ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:14:23 -0400 (0:00:00.531) 0:07:30.165 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:14:23 -0400 (0:00:00.176) 0:07:30.341 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:14:23 -0400 (0:00:00.198) 0:07:30.540 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:14:24 -0400 (0:00:00.168) 0:07:30.708 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:14:24 -0400 (0:00:00.163) 0:07:30.872 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:14:24 -0400 (0:00:00.545) 0:07:31.418 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:14:29 -0400 (0:00:04.731) 0:07:36.149 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:14:29 -0400 (0:00:00.287) 0:07:36.437 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:14:30 -0400 (0:00:00.303) 0:07:36.741 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:14:35 -0400 (0:00:05.329) 0:07:42.070 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:14:35 -0400 (0:00:00.281) 0:07:42.351 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:14:35 -0400 (0:00:00.130) 0:07:42.481 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:14:35 -0400 (0:00:00.094) 0:07:42.575 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:14:36 -0400 (0:00:00.081) 0:07:42.657 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:14:40 -0400 (0:00:04.514) 0:07:47.172 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": 
{ "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:14:43 -0400 (0:00:02.812) 0:07:49.984 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:14:43 -0400 (0:00:00.460) 0:07:50.445 
*********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:14:44 -0400 (0:00:00.267) 0:07:50.712 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", 
"state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:14:58 -0400 (0:00:14.117) 0:08:04.830 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:14:58 -0400 (0:00:00.140) 0:08:04.970 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108390.8550005, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a6d05f1ca722bb14907719d888a012a5b4c8079a", "ctime": 1750108390.8510003, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750108390.8510003, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:14:59 -0400 (0:00:01.504) 0:08:06.475 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:15:01 -0400 (0:00:01.637) 0:08:08.112 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:15:01 -0400 (0:00:00.250) 0:08:08.363 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "password": "-", "state": "absent" }, { "backing_device": 
"/dev/sda1", "name": "luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:15:02 -0400 (0:00:00.290) 0:08:08.654 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:15:02 -0400 (0:00:00.316) 0:08:08.970 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:15:02 -0400 (0:00:00.297) 0:08:09.268 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:15:04 -0400 (0:00:01.559) 0:08:10.827 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:15:05 -0400 (0:00:01.787) 0:08:12.614 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738" } TASK [fedora.linux_system_roles.storage : Manage mount 
ownership/permissions] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Monday 16 June 2025 17:15:07 -0400 (0:00:01.339) 0:08:13.954 ***********
skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "state": "mounted" }, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Monday 16 June 2025 17:15:07 -0400 (0:00:00.278) 0:08:14.232 ***********
ok: [managed-node11] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Monday 16 June 2025 17:15:09 -0400 (0:00:01.681) 0:08:15.913 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108402.0980906, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d7ce732a296bc3dec07ce549cf0bf1be1d8ea6e8", "ctime": 1750108396.151043, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 465567875, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1750108396.1500428, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "946880956", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Monday 16 June 2025 17:15:10 -0400 (0:00:01.449) 0:08:17.363 ***********
changed: [managed-node11] => (item={'backing_device': '/dev/sda', 'name': 'luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed
changed: [managed-node11] => (item={'backing_device': '/dev/sda1', 'name': 'luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "password": "-", "state": "present" } }
MSG: line added
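NOTE: The two loop items above removed the stale whole-disk mapping and added the new partition-backed one to /etc/crypttab. For comparison, an equivalent standalone edit could be written with the community.general.crypttab module (an illustration of the file's semantics, not necessarily what the role calls internally):

    - name: Ensure the new LUKS mapping is present in /etc/crypttab
      community.general.crypttab:
        name: luks-676decb8-5a23-4d42-ba5c-1b3b350e8738
        backing_device: /dev/sda1
        password: "-"   # "-" means no keyfile; the passphrase is supplied interactively
        state: present

TASK [fedora.linux_system_roles.storage : Update facts] ************************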
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Monday 16 June 2025 17:15:13 -0400 (0:00:03.138) 0:08:20.502 ***********
ok: [managed-node11]

TASK [Verify role results] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:241
Monday 16 June 2025 17:15:16 -0400 (0:00:02.317) 0:08:22.820 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11

TASK [Print out pool information] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Monday 16 June 2025 17:15:16 -0400 (0:00:00.520) 0:08:23.340 ***********
ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Monday 16 June 2025 17:15:16 -0400 (0:00:00.115) 0:08:23.456 ***********
skipping: [managed-node11] => {}
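NOTE: The verification flow next snapshots every block device. The info mapping it returns (name, size, type, fstype, uuid, and mountpoint per device) is the kind of data lsblk reports, so a minimal stand-in for the collection step might look like this (assumed shape and register name, not the test's actual helper):

    - name: Collect block device info
      command: lsblk -p --pairs -o NAME,TYPE,SIZE,FSTYPE,UUID,MOUNTPOINT
      register: storage_test_blkinfo   # hypothetical variable name
      changed_when: false

TASK [Collect info about the volumes.]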
***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:15:16 -0400 (0:00:00.039) 0:08:23.496 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "size": "10G", "type": "crypt", "uuid": "edb5f52e-3f8f-4b50-b2be-28f1896ee91c" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "676decb8-5a23-4d42-ba5c-1b3b350e8738" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:15:17 -0400 (0:00:01.076) 0:08:24.573 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002629", "end": "2025-06-16 17:15:19.273959", "rc": 0, "start": "2025-06-16 17:15:19.271330" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Monday 16 June 2025 17:15:19 -0400 (0:00:01.594) 0:08:26.167 ***********
ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002703", "end": "2025-06-16 17:15:20.719550", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:15:20.716847" }
STDOUT:
luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 /dev/sda1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Monday 16 June 2025 17:15:21 -0400 (0:00:01.490) 0:08:27.658 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Monday 16 June 2025 17:15:21 -0400 (0:00:00.421) 0:08:28.079 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Monday 16 June 2025 17:15:21 -0400 (0:00:00.269) 0:08:28.348 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Monday 16 June 2025 17:15:21 -0400 (0:00:00.247) 0:08:28.596 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Monday 16 June 2025 17:15:22 -0400 (0:00:00.217) 0:08:28.814 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11
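NOTE: _storage_pool_tests was just set to ["members", "volumes"], and the two files included here match that list one-for-one, which suggests the subset dispatch is an include loop of roughly this shape (a sketch inferred from the output, not the verified source):

    - name: Verify pool subset
      include_tasks: "test-verify-pool-{{ storage_test_pool_subset }}.yml"
      loop: "{{ _storage_pool_tests }}"
      loop_control:
        loop_var: storage_test_pool_subset   # assumed loop variable name

included: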
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 16 June 2025 17:15:22 -0400 (0:00:00.521) 0:08:29.335 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 16 June 2025 17:15:23 -0400 (0:00:00.318) 0:08:29.653 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 16 June 2025 17:15:23 -0400 (0:00:00.257) 0:08:29.911 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 16 June 2025 17:15:23 -0400 (0:00:00.239) 0:08:30.151 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 16 June 2025 17:15:23 -0400 (0:00:00.241) 0:08:30.392 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 16 June 2025 17:15:23 -0400 (0:00:00.208) 0:08:30.601 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 16 June 2025 17:15:24 -0400 (0:00:00.217) 0:08:30.818 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 16 June 2025 17:15:24 -0400 (0:00:00.234) 0:08:31.052 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:54 Monday 16 June 2025 17:15:24 -0400 (0:00:00.158) 0:08:31.211 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to 
fill] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:67 Monday 16 June 2025 17:15:24 -0400 (0:00:00.294) 0:08:31.505 *********** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.162 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:77 Monday 16 June 2025 17:15:26 -0400 (0:00:01.566) 0:08:33.071 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Monday 16 June 2025 17:15:26 -0400 (0:00:00.215) 0:08:33.287 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 16 June 2025 17:15:27 -0400 (0:00:00.478) 0:08:33.766 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 16 June 2025 17:15:27 -0400 (0:00:00.291) 0:08:34.058 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 16 June 2025 17:15:27 -0400 (0:00:00.292) 0:08:34.351 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 16 June 2025 17:15:28 -0400 (0:00:00.296) 0:08:34.647 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 16 June 2025 17:15:28 -0400 (0:00:00.708) 0:08:35.355 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 16 June 2025 17:15:29 -0400 (0:00:00.283) 0:08:35.638 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 16 June 2025 17:15:29 -0400 (0:00:00.305) 0:08:35.944 *********** skipping: 
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 16 June 2025 17:15:29 -0400 (0:00:00.332) 0:08:36.276 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 16 June 2025 17:15:29 -0400 (0:00:00.256) 0:08:36.533 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 16 June 2025 17:15:30 -0400 (0:00:00.289) 0:08:36.823 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 16 June 2025 17:15:30 -0400 (0:00:00.236) 0:08:37.059 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Monday 16 June 2025 17:15:30 -0400 (0:00:00.199) 0:08:37.259 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 16 June 2025 17:15:31 -0400 (0:00:00.676) 0:08:37.935 *********** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', '_kernel_device': 
'/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Monday 16 June 2025 17:15:31 -0400 (0:00:00.247) 0:08:38.183 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 16 June 2025 17:15:32 -0400 (0:00:00.529) 0:08:38.713 *********** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Monday 16 June 2025 17:15:32 -0400 (0:00:00.383) 0:08:39.097 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 16 June 2025 17:15:33 -0400 (0:00:00.542) 0:08:39.639 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 16 June 2025 17:15:33 -0400 (0:00:00.375) 0:08:40.015 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 16 June 2025 17:15:33 -0400 (0:00:00.263) 0:08:40.278 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 16 June 2025 17:15:33 -0400 (0:00:00.251) 0:08:40.530 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Monday 16 June 2025 17:15:34 -0400 (0:00:00.207) 0:08:40.737 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK 
[Validate pool member VDO settings] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 16 June 2025 17:15:34 -0400 (0:00:00.649) 0:08:41.387 *********** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Monday 16 June 2025 17:15:35 -0400 (0:00:00.296) 0:08:41.683 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 16 June 2025 17:15:35 -0400 (0:00:00.652) 0:08:42.336 *********** skipping: [managed-node11] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 16 June 2025 17:15:36 -0400 (0:00:00.320) 0:08:42.657 *********** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 16 June 2025 17:15:36 -0400 (0:00:00.182) 0:08:42.840 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 16 June 2025 17:15:36 -0400 (0:00:00.286) 0:08:43.127 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 16 June 2025 17:15:36 -0400 (0:00:00.215) 0:08:43.342 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 16 June 2025 17:15:37 -0400 (0:00:00.319) 0:08:43.661 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 16 June 2025 17:15:37 -0400 (0:00:00.299) 0:08:43.960 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:105 Monday 16 June 2025 17:15:37 -0400 (0:00:00.222) 0:08:44.183 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 16 June 2025 17:15:37 -0400 (0:00:00.170) 0:08:44.354 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:15:38 -0400 (0:00:00.440) 0:08:44.795 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", 
"encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:15:38 -0400 (0:00:00.298) 0:08:45.093 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:15:39 -0400 (0:00:01.293) 0:08:46.387 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:15:39 -0400 (0:00:00.236) 0:08:46.623 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:15:40 -0400 (0:00:00.361) 0:08:46.985 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:15:40 -0400 (0:00:00.369) 0:08:47.354 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:15:41 -0400 (0:00:00.301) 0:08:47.656 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:15:41 -0400 (0:00:00.643) 0:08:48.300 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:15:41 -0400 (0:00:00.287) 0:08:48.587 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 16 June 2025 17:15:42 -0400 (0:00:00.335) 0:08:48.923 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 16 June 2025 17:15:42 -0400 (0:00:00.268) 0:08:49.191 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 16 June 2025 17:15:42 -0400 (0:00:00.260) 0:08:49.452 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 16 June 2025 17:15:43 -0400 (0:00:00.306) 0:08:49.759 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 16 June 2025 17:15:43 -0400 (0:00:00.267) 0:08:50.026 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 16 June 2025 17:15:43 -0400 (0:00:00.399) 0:08:50.425 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 16 June 2025 17:15:44 -0400 (0:00:00.299) 0:08:50.725 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 16 June 2025 17:15:44 -0400 (0:00:00.364) 0:08:51.090 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 16 June 2025 17:15:44 -0400 (0:00:00.262) 0:08:51.352 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 16 June 2025 17:15:44 -0400 (0:00:00.267) 0:08:51.620 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 16 June 2025 17:15:45 -0400 (0:00:00.233) 0:08:51.853 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 16 June 2025 17:15:45 -0400 (0:00:00.305) 0:08:52.159 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 16 June 2025 17:15:45 -0400 (0:00:00.283) 0:08:52.442 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108497.8028576, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108497.8028576, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 182953, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1750108497.8028576, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 16 June 2025 17:15:47 -0400 (0:00:01.457) 0:08:53.899 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 16 June 2025 17:15:47 -0400 (0:00:00.191) 0:08:54.091 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 16 June 2025 17:15:47 -0400 (0:00:00.103) 0:08:54.194 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 16 June 2025 17:15:47 -0400 (0:00:00.179) 0:08:54.374 *********** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 16 June 2025 17:15:47 -0400 (0:00:00.104) 0:08:54.478 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 16 June 2025 17:15:48 -0400 (0:00:00.236) 0:08:54.715 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 16 June 2025 17:15:48 -0400 (0:00:00.305) 0:08:55.020 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108497.9538589, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108497.9538589, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 184057, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750108497.9538589, "nlink": 1, "path": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 16 June 2025 17:15:50 -0400 (0:00:01.894) 0:08:56.914 *********** ok: [managed-node11] => { "changed": false, "rc": 0, 
"results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 16 June 2025 17:15:54 -0400 (0:00:04.701) 0:09:01.615 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.009968", "end": "2025-06-16 17:15:56.100695", "rc": 0, "start": "2025-06-16 17:15:56.090727" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 676decb8-5a23-4d42-ba5c-1b3b350e8738 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 930270 Threads: 2 Salt: 8a 76 1d 55 2d 52 86 1e 17 db 71 b4 73 78 f3 e5 04 d8 e7 8b 12 cd fd 89 c6 2e 1a 70 77 25 9c 74 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 50 b1 ec 59 2a be 13 71 10 6a 5f 8a 64 2e 94 af 3e 43 3e e5 6f b8 7f c8 a9 43 50 72 7d 9f 5b c8 Digest: 2c 54 58 7d c4 9c d4 70 e6 0b a2 0e af 8e 63 41 da 90 54 3d 35 8e 4a 8a d6 70 ad 40 e0 ae 6c 66 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 16 June 2025 17:15:56 -0400 (0:00:01.335) 0:09:02.951 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 16 June 2025 17:15:56 -0400 (0:00:00.140) 0:09:03.092 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 16 June 2025 17:15:56 -0400 (0:00:00.340) 0:09:03.432 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 16 June 2025 17:15:57 -0400 (0:00:00.225) 0:09:03.657 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 16 June 2025 17:15:57 -0400 (0:00:00.246) 0:09:03.904 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 16 June 2025 17:15:57 -0400 (0:00:00.305) 0:09:04.209 *********** skipping: 
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 16 June 2025 17:15:57 -0400 (0:00:00.254) 0:09:04.464 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 16 June 2025 17:15:58 -0400 (0:00:00.208) 0:09:04.672 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 16 June 2025 17:15:58 -0400 (0:00:00.177) 0:09:04.850 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 16 June 2025 17:15:58 -0400 (0:00:00.207) 0:09:05.057 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 16 June 2025 17:15:58 -0400 (0:00:00.222) 0:09:05.280 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 16 June 2025 17:15:58 -0400 (0:00:00.199) 0:09:05.479 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 16 June 2025 17:15:59 -0400 (0:00:00.279) 0:09:05.758 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 16 June 2025 17:15:59 -0400 (0:00:00.178) 0:09:05.937 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 16 June 2025 17:15:59 -0400 (0:00:00.201) 
0:09:06.138 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 16 June 2025 17:15:59 -0400 (0:00:00.218) 0:09:06.357 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 16 June 2025 17:15:59 -0400 (0:00:00.216) 0:09:06.573 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 16 June 2025 17:16:00 -0400 (0:00:00.239) 0:09:06.812 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:16:00 -0400 (0:00:00.281) 0:09:07.094 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:16:00 -0400 (0:00:00.310) 0:09:07.405 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:16:01 -0400 (0:00:00.260) 0:09:07.665 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:16:01 -0400 (0:00:00.315) 0:09:07.981 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:16:01 -0400 (0:00:00.316) 0:09:08.297 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:16:01 -0400 (0:00:00.322) 0:09:08.619 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] 
********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:16:02 -0400 (0:00:00.274) 0:09:08.894 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:16:02 -0400 (0:00:00.331) 0:09:09.225 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:16:02 -0400 (0:00:00.379) 0:09:09.605 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:16:03 -0400 (0:00:00.192) 0:09:09.797 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:16:03 -0400 (0:00:00.109) 0:09:09.907 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:16:03 -0400 (0:00:00.335) 0:09:10.242 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:16:03 -0400 (0:00:00.332) 0:09:10.575 *********** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:16:04 -0400 (0:00:00.298) 0:09:10.873 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:16:04 -0400 (0:00:00.353) 0:09:11.227 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:16:04 -0400 (0:00:00.337) 0:09:11.564 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default 
maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:16:05 -0400 (0:00:00.314) 0:09:11.879 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 16 June 2025 17:16:05 -0400 (0:00:00.302) 0:09:12.181 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 16 June 2025 17:16:06 -0400 (0:00:00.641) 0:09:12.822 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:16:06 -0400 (0:00:00.448) 0:09:13.271 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:16:06 -0400 (0:00:00.311) 0:09:13.583 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:16:07 -0400 (0:00:00.380) 0:09:13.963 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:16:07 -0400 (0:00:00.370) 0:09:14.334 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:16:07 -0400 (0:00:00.229) 0:09:14.563 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:16:08 -0400 (0:00:00.329) 0:09:14.892 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:16:08 -0400 (0:00:00.265) 0:09:15.158 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:16:08 -0400 (0:00:00.303) 0:09:15.462 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:16:09 -0400 (0:00:00.418) 0:09:15.880 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 17:16:09 -0400 (0:00:00.290) 0:09:16.170 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:16:09 -0400 (0:00:00.254) 0:09:16.425 *********** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:16:10 -0400 (0:00:00.227) 0:09:16.653 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:16:10 -0400 (0:00:00.239) 0:09:16.892 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:16:10 -0400 (0:00:00.248) 0:09:17.141 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:16:10 -0400 (0:00:00.294) 0:09:17.435 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:16:10 -0400 (0:00:00.190) 0:09:17.625 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:16:11 -0400 (0:00:00.233) 0:09:17.859 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:16:11 -0400 (0:00:00.325) 0:09:18.184 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 16 June 2025 17:16:11 -0400 (0:00:00.184) 0:09:18.368 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 16 June 2025 17:16:12 -0400 (0:00:00.266) 0:09:18.635 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 16 June 2025 17:16:12 -0400 (0:00:00.223) 0:09:18.858 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:16:12 -0400 (0:00:00.257) 0:09:19.116 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 16 June 2025 17:16:12 -0400 (0:00:00.170) 0:09:19.286 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 16 June 2025 17:16:12 -0400 (0:00:00.164) 0:09:19.451 *********** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:247 Monday 16 June 2025 17:16:14 -0400 (0:00:01.572) 0:09:21.024 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 16 June 2025 17:16:14 -0400 (0:00:00.561) 0:09:21.586 *********** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 16 June 2025 17:16:15 -0400 (0:00:00.212) 0:09:21.798 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:16:15 -0400 (0:00:00.343) 0:09:22.142 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:16:15 -0400 (0:00:00.336) 0:09:22.478 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:16:16 -0400 (0:00:00.353) 0:09:22.832 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:16:16 -0400 (0:00:00.604) 0:09:23.436 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is 
ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:16:17 -0400 (0:00:00.207) 0:09:23.644 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:16:17 -0400 (0:00:00.287) 0:09:23.932 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:16:17 -0400 (0:00:00.183) 0:09:24.115 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:16:17 -0400 (0:00:00.261) 0:09:24.376 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:16:18 -0400 (0:00:00.729) 0:09:25.106 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:16:23 -0400 (0:00:04.759) 0:09:29.866 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:16:23 -0400 (0:00:00.288) 0:09:30.154 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:16:23 -0400 (0:00:00.194) 0:09:30.348 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:16:29 -0400 (0:00:05.595) 0:09:35.943 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK 
[fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:16:29 -0400 (0:00:00.436) 0:09:36.380 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:16:30 -0400 (0:00:00.295) 0:09:36.675 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:16:30 -0400 (0:00:00.234) 0:09:36.909 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:16:30 -0400 (0:00:00.186) 0:09:37.096 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:16:35 -0400 (0:00:04.746) 0:09:41.843 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { 
"name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": 
"sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service": { "name": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service": { "name": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:16:38 -0400 (0:00:02.896) 0:09:44.739 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:16:38 -0400 (0:00:00.456) 0:09:45.196 *********** changed: [managed-node11] => 
(item=systemd-cryptsetup@luks\x2d3b147b0f\x2da34f\x2d43bf\x2da4ef\x2de30d56b04273.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "name": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-sda.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-3b147b0f-a34f-43bf-a4ef-e30d56b04273 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", 
"IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:15:08 EDT", "StateChangeTimestampMonotonic": "2144950597", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", 
"SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...da34f\x2d43bf\x2da4ef\x2de30d56b04273.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "name": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": 
"0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:16:42 -0400 (0:00:03.589) 0:09:48.785 *********** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-676decb8-5a23-4d42-ba5c-1b3b350e8738' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:16:47 -0400 (0:00:05.224) 0:09:54.010 *********** fatal: [managed-node11]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-676decb8-5a23-4d42-ba5c-1b3b350e8738' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 
'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:16:47 -0400 (0:00:00.220) 0:09:54.231 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d3b147b0f\x2da34f\x2d43bf\x2da4ef\x2de30d56b04273.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "name": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", 
"IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d3b147b0f\\x2da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", 
"SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...da34f\x2d43bf\x2da4ef\x2de30d56b04273.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "name": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", 
"KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...da34f\\x2d43bf\\x2da4ef\\x2de30d56b04273.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", 
"WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:16:51 -0400 (0:00:03.673) 0:09:57.904 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:16:51 -0400 (0:00:00.267) 0:09:58.172 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Monday 16 June 2025 17:16:51 -0400 (0:00:00.356) 0:09:58.528 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 16 June 2025 17:16:52 -0400 (0:00:00.327) 0:09:58.855 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108574.1714694, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750108574.1714694, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1750108574.1714694, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3373683278", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 16 June 2025 17:16:53 -0400 (0:00:01.556) 0:10:00.412 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:272 Monday 16 June 2025 17:16:54 -0400 (0:00:00.251) 0:10:00.664 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:16:54 -0400 (0:00:00.895) 0:10:01.559 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:16:55 -0400 (0:00:00.282) 0:10:01.841 *********** skipping: 
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:16:55 -0400 (0:00:00.253) 0:10:02.094 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:16:55 -0400 (0:00:00.364) 0:10:02.459 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:16:56 -0400 (0:00:00.207) 0:10:02.666 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:16:56 -0400 (0:00:00.251) 0:10:02.918 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:16:56 -0400 (0:00:00.244) 0:10:03.162 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:16:56 -0400 
(0:00:00.181) 0:10:03.344 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:16:57 -0400 (0:00:00.614) 0:10:03.958 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:17:02 -0400 (0:00:04.993) 0:10:08.952 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:17:02 -0400 (0:00:00.312) 0:10:09.264 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:17:02 -0400 (0:00:00.310) 0:10:09.574 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:17:08 -0400 (0:00:05.645) 0:10:15.220 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:17:08 -0400 (0:00:00.395) 0:10:15.615 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:17:09 -0400 (0:00:00.206) 0:10:15.822 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:17:09 -0400 (0:00:00.243) 0:10:16.065 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:17:09 -0400 (0:00:00.174) 0:10:16.239 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } 
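
The pool specification echoed by "Show storage_pools" above corresponds to role input along these lines. This is a minimal sketch reconstructed from the logged values; the invoking play itself is not part of this log, so the include mechanism is an assumption (the task name "Remove the encryption layer" appears earlier in the log):

- name: Remove the encryption layer
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: false               # LUKS is being removed from this volume
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo
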
MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:17:14 -0400 (0:00:04.690) 0:10:20.930 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": 
"dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service": { "name": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service": { "name": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": 
"systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:17:17 -0400 (0:00:02.912) 0:10:23.843 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:17:17 -0400 (0:00:00.376) 0:10:24.220 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d676decb8\x2d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "name": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-sda1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", 
"CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": 
"14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:16:41 EDT", "StateChangeTimestampMonotonic": "2237919280", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "name": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": 
"no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", 
"LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:17:22 -0400 (0:00:04.517) 0:10:28.737 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:17:27 -0400 (0:00:05.717) 0:10:34.454 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:17:28 -0400 (0:00:00.283) 0:10:34.738 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108507.0149314, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5de12056a122db4f83cba51aa8ce1366c4caa1e8", "ctime": 1750108507.0119314, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750108507.0119314, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add 
fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:17:29 -0400 (0:00:01.653) 0:10:36.392 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:17:31 -0400 (0:00:01.517) 0:10:37.910 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d676decb8\x2d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "name": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", 
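
Throughout these unit names, "\x2d" is systemd's escape sequence for a literal "-": the unit guarding /dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 becomes systemd-cryptsetup@luks\x2d676decb8\x2d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service. A sketch of reproducing that name on the managed node (assumes systemd-escape(1) is available; the truncated "luk..." names elsewhere in this log are shortened by the logger itself and are left as printed):

- name: Compute the escaped cryptsetup unit name
  command:
    argv:
      - systemd-escape
      - --template=systemd-cryptsetup@.service
      - luks-676decb8-5a23-4d42-ba5c-1b3b350e8738
  register: cryptsetup_unit
  changed_when: false
# cryptsetup_unit.stdout:
#   systemd-cryptsetup@luks\x2d676decb8\x2d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service
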
"JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:16:41 EDT", "StateChangeTimestampMonotonic": "2237919280", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": 
"no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "name": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", 
"LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:17:34 -0400 (0:00:03.614) 0:10:41.524 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:17:35 -0400 (0:00:00.313) 0:10:41.838 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:17:35 -0400 (0:00:00.274) 0:10:42.112 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:17:35 -0400 (0:00:00.198) 0:10:42.311 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-676decb8-5a23-4d42-ba5c-1b3b350e8738" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:17:37 -0400 (0:00:01.636) 0:10:43.947 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:17:39 -0400 (0:00:01.712) 0:10:45.660 *********** changed: [managed-node11] => (item={'src': 'UUID=120ce2df-1c94-4c84-b109-58098c507737', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": 
true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:17:40 -0400 (0:00:01.276) 0:10:46.936 *********** skipping: [managed-node11] => (item={'src': 'UUID=120ce2df-1c94-4c84-b109-58098c507737', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:17:40 -0400 (0:00:00.188) 0:10:47.124 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:17:41 -0400 (0:00:01.269) 0:10:48.394 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108520.7180414, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "00570fafdcf9afcaf12c96a08351fbec19a093b9", "ctime": 1750108513.5609841, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 54526215, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1750108513.5609841, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "3471205244", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:17:42 -0400 (0:00:01.232) 0:10:49.627 *********** changed: [managed-node11] => (item={'backing_device': '/dev/sda1', 'name': 'luks-676decb8-5a23-4d42-ba5c-1b3b350e8738', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Monday 16 June 2025 17:17:44 -0400 (0:00:01.485) 0:10:51.112 *********** ok: [managed-node11] TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290 Monday 16 June 2025 17:17:46 -0400 (0:00:01.793) 0:10:52.906 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:17:47 -0400 (0:00:01.096) 0:10:54.003 *********** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:17:47 -0400 (0:00:00.265) 0:10:54.268 *********** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:17:47 -0400 (0:00:00.165) 0:10:54.433 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "120ce2df-1c94-4c84-b109-58098c507737" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:17:49 -0400 (0:00:01.353) 0:10:55.787 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002878", "end": "2025-06-16 17:17:50.344851", "rc": 0, "start": "2025-06-16 17:17:50.341973" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=120ce2df-1c94-4c84-b109-58098c507737 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 16 June 2025 17:17:50 -0400 (0:00:01.461) 0:10:57.248 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003235", "end": "2025-06-16 17:17:51.470119", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:17:51.466884" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 16 June 2025 17:17:51 -0400 (0:00:01.180) 0:10:58.428 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 16 June 2025 17:17:52 -0400 (0:00:00.447) 0:10:58.875 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 16 June 2025 17:17:52 -0400 (0:00:00.216) 0:10:59.092 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 16 June 2025 17:17:52 -0400 (0:00:00.190) 0:10:59.283 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 16 June 2025 17:17:52 -0400 (0:00:00.273) 0:10:59.556 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 included: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 16 June 2025 17:17:53 -0400 (0:00:00.626) 0:11:00.182 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 16 June 2025 17:17:53 -0400 (0:00:00.244) 0:11:00.427 *********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 16 June 2025 17:17:53 -0400 (0:00:00.175) 0:11:00.602 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 16 June 2025 17:17:54 -0400 (0:00:00.268) 0:11:00.871 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 16 June 2025 17:17:54 -0400 (0:00:00.341) 0:11:01.212 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 16 June 2025 17:17:54 -0400 (0:00:00.296) 0:11:01.509 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 16 June 2025 17:17:55 -0400 (0:00:00.230) 0:11:01.740 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 16 June 2025 17:17:55 -0400 (0:00:00.234) 0:11:01.975 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:54 Monday 16 June 2025 17:17:55 -0400 (0:00:00.222) 0:11:02.197 *********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:67 Monday 16 June 2025 17:17:55 -0400 
(0:00:00.142) 0:11:02.339 *********** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.162 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:77 Monday 16 June 2025 17:17:57 -0400 (0:00:01.525) 0:11:03.865 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Monday 16 June 2025 17:17:57 -0400 (0:00:00.238) 0:11:04.104 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 16 June 2025 17:17:57 -0400 (0:00:00.472) 0:11:04.576 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 16 June 2025 17:17:58 -0400 (0:00:00.351) 0:11:04.927 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 16 June 2025 17:17:58 -0400 (0:00:00.329) 0:11:05.257 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 16 June 2025 17:17:58 -0400 (0:00:00.172) 0:11:05.429 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 16 June 2025 17:17:59 -0400 (0:00:00.273) 0:11:05.702 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 16 June 2025 17:17:59 -0400 (0:00:00.219) 0:11:05.922 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 16 June 2025 17:17:59 -0400 (0:00:00.179) 0:11:06.101 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 16 June 2025 17:17:59 -0400 (0:00:00.229) 0:11:06.331 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 16 June 2025 17:17:59 -0400 (0:00:00.229) 0:11:06.561 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 16 June 2025 17:18:00 -0400 (0:00:00.237) 0:11:06.799 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 16 June 2025 17:18:00 -0400 (0:00:00.229) 0:11:07.028 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Monday 16 June 2025 17:18:00 -0400 (0:00:00.168) 0:11:07.197 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 16 June 2025 17:18:01 -0400 (0:00:00.472) 0:11:07.670 *********** skipping: [managed-node11] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=120ce2df-1c94-4c84-b109-58098c507737', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", 
"_kernel_device": "/dev/sda1", "_mount_id": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Monday 16 June 2025 17:18:01 -0400 (0:00:00.586) 0:11:08.256 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 16 June 2025 17:18:02 -0400 (0:00:00.484) 0:11:08.741 *********** skipping: [managed-node11] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=120ce2df-1c94-4c84-b109-58098c507737', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Monday 16 June 2025 17:18:02 -0400 (0:00:00.299) 0:11:09.040 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 16 June 2025 17:18:03 -0400 (0:00:00.592) 0:11:09.633 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 16 June 2025 17:18:03 -0400 (0:00:00.258) 0:11:09.892 *********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 16 June 2025 17:18:03 -0400 (0:00:00.333) 0:11:10.226 *********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 16 June 2025 17:18:03 -0400 (0:00:00.245) 0:11:10.471 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Monday 16 June 2025 17:18:04 -0400 (0:00:00.223) 0:11:10.695 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 16 June 2025 17:18:04 -0400 (0:00:00.341) 0:11:11.036 *********** skipping: [managed-node11] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 
'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=120ce2df-1c94-4c84-b109-58098c507737', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Monday 16 June 2025 17:18:04 -0400 (0:00:00.239) 0:11:11.276 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 16 June 2025 17:18:05 -0400 (0:00:00.421) 0:11:11.698 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 16 June 2025 17:18:05 -0400 (0:00:00.290) 0:11:11.988 *********** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 16 June 2025 17:18:05 -0400 (0:00:00.259) 0:11:12.248 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 16 June 2025 17:18:05 -0400 (0:00:00.253) 0:11:12.501 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 16 June 2025 17:18:06 -0400 (0:00:00.255) 0:11:12.757 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 16 June 2025 17:18:06 -0400 (0:00:00.269) 0:11:13.027 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 16 June 2025 17:18:06 -0400 (0:00:00.174) 0:11:13.201 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:105 Monday 16 June 2025 17:18:06 -0400 (0:00:00.218) 0:11:13.420 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 16 June 2025 17:18:06 -0400 (0:00:00.178) 0:11:13.598 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:18:07 -0400 (0:00:00.266) 0:11:13.865 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:18:07 -0400 (0:00:00.214) 0:11:14.080 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:18:08 -0400 (0:00:01.272) 0:11:15.352 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:18:09 -0400 (0:00:00.287) 0:11:15.639 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:18:09 -0400 (0:00:00.342) 0:11:15.982 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:18:09 -0400 (0:00:00.314) 0:11:16.296 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:18:09 -0400 (0:00:00.184) 0:11:16.481 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:18:10 -0400 (0:00:00.243) 0:11:16.724 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:18:10 -0400 (0:00:00.317) 0:11:17.042 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] 
******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Monday 16 June 2025 17:18:10 -0400 (0:00:00.229) 0:11:17.272 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Monday 16 June 2025 17:18:10 -0400 (0:00:00.273) 0:11:17.545 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 16 June 2025 17:18:11 -0400 (0:00:00.308) 0:11:17.854 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 16 June 2025 17:18:11 -0400 (0:00:00.313) 0:11:18.168 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 16 June 2025 17:18:11 -0400 (0:00:00.206) 0:11:18.375 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=120ce2df-1c94-4c84-b109-58098c507737 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 16 June 2025 17:18:12 -0400 (0:00:00.311) 0:11:18.775 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 16 June 2025 17:18:12 -0400 (0:00:00.311) 0:11:19.087 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 16 June 2025 17:18:13 -0400 (0:00:00.712) 0:11:19.800 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
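Taken together, the match strings above reconstruct the /etc/fstab entry the role wrote for this volume. As an illustration only (the trailing dump/pass fields are assumed to be the conventional "0 0"; they are not part of the matched strings):

    UUID=120ce2df-1c94-4c84-b109-58098c507737 /opt/test1 xfs defaults 0 0

Each of the expected_*_matches counters requires exactly one match, so these checks catch both a missing entry and a duplicated one.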
TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 16 June 2025 17:18:13 -0400 (0:00:00.217) 0:11:20.017 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 16 June 2025 17:18:13 -0400 (0:00:00.174) 0:11:20.192 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 16 June 2025 17:18:13 -0400 (0:00:00.162) 0:11:20.355 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 16 June 2025 17:18:14 -0400 (0:00:00.392) 0:11:20.748 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 16 June 2025 17:18:14 -0400 (0:00:00.309) 0:11:21.058 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108647.4940567, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108647.4940567, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 182953, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1750108647.4940567, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 16 June 2025 17:18:15 -0400 (0:00:01.562) 0:11:22.620 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 16 June 2025 17:18:16 -0400 (0:00:00.395) 0:11:23.016 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path:
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 16 June 2025 17:18:16 -0400 (0:00:00.194) 0:11:23.211 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 16 June 2025 17:18:16 -0400 (0:00:00.303) 0:11:23.515 *********** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 16 June 2025 17:18:17 -0400 (0:00:00.273) 0:11:23.788 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 16 June 2025 17:18:17 -0400 (0:00:00.333) 0:11:24.122 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 16 June 2025 17:18:17 -0400 (0:00:00.262) 0:11:24.384 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 16 June 2025 17:18:17 -0400 (0:00:00.225) 0:11:24.610 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 16 June 2025 17:18:22 -0400 (0:00:04.367) 0:11:28.978 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 16 June 2025 17:18:22 -0400 (0:00:00.297) 0:11:29.276 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 16 June 2025 17:18:22 -0400 (0:00:00.216) 0:11:29.492 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 16 June 2025 17:18:23 -0400 (0:00:00.322) 0:11:29.815 *********** skipping: [managed-node11] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 16 June 2025 17:18:23 -0400 (0:00:00.216) 0:11:30.031 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 16 June 2025 17:18:23 -0400 (0:00:00.239) 0:11:30.271 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 16 June 2025 17:18:23 -0400 (0:00:00.296) 0:11:30.567 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 16 June 2025 17:18:24 -0400 (0:00:00.316) 0:11:30.883 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 16 June 2025 17:18:24 -0400 (0:00:00.241) 0:11:31.125 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 16 June 2025 17:18:24 -0400 (0:00:00.202) 0:11:31.328 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 16 June 2025 17:18:25 -0400 (0:00:00.493) 0:11:31.821 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 16 June 2025 17:18:25 -0400 (0:00:00.317) 0:11:32.138 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 16 June 2025 17:18:25 -0400 (0:00:00.273) 0:11:32.412 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional 
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 16 June 2025 17:18:26 -0400 (0:00:00.287) 0:11:32.699 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 16 June 2025 17:18:26 -0400 (0:00:00.214) 0:11:32.913 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 16 June 2025 17:18:26 -0400 (0:00:00.415) 0:11:33.329 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 16 June 2025 17:18:26 -0400 (0:00:00.294) 0:11:33.623 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 16 June 2025 17:18:27 -0400 (0:00:00.272) 0:11:33.896 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 16 June 2025 17:18:27 -0400 (0:00:00.197) 0:11:34.094 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 16 June 2025 17:18:27 -0400 (0:00:00.189) 0:11:34.284 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 16 June 2025 17:18:27 -0400 (0:00:00.144) 0:11:34.428 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 16 June 2025 17:18:28 -0400 (0:00:00.228) 0:11:34.657 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version]
********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:18:28 -0400 (0:00:00.252) 0:11:34.909 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:18:28 -0400 (0:00:00.143) 0:11:35.053 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:18:28 -0400 (0:00:00.087) 0:11:35.141 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:18:28 -0400 (0:00:00.185) 0:11:35.326 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:18:28 -0400 (0:00:00.155) 0:11:35.482 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:18:29 -0400 (0:00:00.206) 0:11:35.689 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:18:29 -0400 (0:00:00.135) 0:11:35.824 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:18:29 -0400 (0:00:00.216) 0:11:36.041 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:18:29 -0400 (0:00:00.227) 0:11:36.268 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:18:29 -0400 (0:00:00.323) 0:11:36.591 *********** skipping: [managed-node11] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:18:30 -0400 (0:00:00.318) 0:11:36.910 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:18:30 -0400 (0:00:00.242) 0:11:37.152 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:18:30 -0400 (0:00:00.284) 0:11:37.437 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:18:31 -0400 (0:00:00.230) 0:11:37.668 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 16 June 2025 17:18:31 -0400 (0:00:00.245) 0:11:37.913 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 16 June 2025 17:18:31 -0400 (0:00:00.355) 0:11:38.269 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:18:31 -0400 (0:00:00.096) 0:11:38.366 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:18:31 -0400 (0:00:00.108) 0:11:38.474 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:18:32 -0400 (0:00:00.227) 0:11:38.702 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:18:32 -0400 
(0:00:00.200) 0:11:38.902 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:18:32 -0400 (0:00:00.148) 0:11:39.051 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:18:32 -0400 (0:00:00.154) 0:11:39.205 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:18:32 -0400 (0:00:00.168) 0:11:39.374 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:18:32 -0400 (0:00:00.217) 0:11:39.592 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:18:33 -0400 (0:00:00.211) 0:11:39.803 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 17:18:33 -0400 (0:00:00.248) 0:11:40.052 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:18:33 -0400 (0:00:00.162) 0:11:40.215 *********** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:18:33 -0400 (0:00:00.163) 0:11:40.379 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:18:33 -0400 (0:00:00.186) 0:11:40.565 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task 
path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:18:34 -0400 (0:00:00.182) 0:11:40.748 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:18:34 -0400 (0:00:00.114) 0:11:40.863 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:18:34 -0400 (0:00:00.111) 0:11:40.975 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:18:34 -0400 (0:00:00.334) 0:11:41.310 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:18:34 -0400 (0:00:00.226) 0:11:41.536 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 16 June 2025 17:18:35 -0400 (0:00:00.134) 0:11:41.670 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 16 June 2025 17:18:35 -0400 (0:00:00.112) 0:11:41.783 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 16 June 2025 17:18:35 -0400 (0:00:00.223) 0:11:42.006 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:18:35 -0400 (0:00:00.109) 0:11:42.115 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 16 June 2025 17:18:35 -0400 (0:00:00.235) 0:11:42.350 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, 
"storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 16 June 2025 17:18:35 -0400 (0:00:00.216) 0:11:42.567 *********** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:296 Monday 16 June 2025 17:18:37 -0400 (0:00:01.403) 0:11:43.971 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 16 June 2025 17:18:37 -0400 (0:00:00.386) 0:11:44.357 *********** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 16 June 2025 17:18:37 -0400 (0:00:00.173) 0:11:44.531 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:18:38 -0400 (0:00:00.181) 0:11:44.713 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:18:38 -0400 (0:00:00.183) 0:11:44.896 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:18:38 -0400 (0:00:00.269) 0:11:45.166 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:18:38 -0400 (0:00:00.320) 0:11:45.486 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:18:39 -0400 (0:00:00.188) 0:11:45.674 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:18:39 -0400 (0:00:00.143) 0:11:45.817 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:18:39 -0400 (0:00:00.103) 0:11:45.921 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:18:39 -0400 (0:00:00.117) 0:11:46.039 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:18:39 -0400 (0:00:00.501) 0:11:46.540 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:18:44 -0400 (0:00:04.652) 0:11:51.193 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task 
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Monday 16 June 2025 17:18:44 -0400 (0:00:00.388) 0:11:51.581 ***********
ok: [managed-node11] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Monday 16 June 2025 17:18:45 -0400 (0:00:00.282) 0:11:51.864 ***********
ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 16 June 2025 17:18:50 -0400 (0:00:05.311) 0:11:57.175 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 16 June 2025 17:18:50 -0400 (0:00:00.379) 0:11:57.554 ***********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 16 June 2025 17:18:51 -0400 (0:00:00.230) 0:11:57.784 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 16 June 2025 17:18:51 -0400 (0:00:00.244) 0:11:58.029 ***********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Monday 16 June 2025 17:18:51 -0400 (0:00:00.164) 0:11:58.193 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Monday 16 June 2025 17:18:56 -0400 (0:00:04.610) 0:12:02.803 ***********
ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service",
"source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": 
"plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service": { "name": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service": { "name": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { 
"name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:18:59 -0400 (0:00:03.050) 0:12:05.854 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:18:59 -0400 (0:00:00.329) 0:12:06.183 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d676decb8\x2d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "name": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-sda1.device cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-676decb8-5a23-4d42-ba5c-1b3b350e8738", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-676decb8-5a23-4d42-ba5c-1b3b350e8738 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": 
"cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:16:41 EDT", "StateChangeTimestampMonotonic": "2237919280", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "name": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:19:03 -0400 (0:00:04.053) 0:12:10.236 *********** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:19:09 -0400 (0:00:05.597) 0:12:15.834 *********** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:19:09 -0400 (0:00:00.298) 0:12:16.133 *********** changed: [managed-node11] => 
(item=systemd-cryptsetup@luks\x2d676decb8\x2d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "name": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d676decb8\\x2d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d5a23\x2d4d42\x2dba5c\x2d1b3b350e8738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "name": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d5a23\\x2d4d42\\x2dba5c\\x2d1b3b350e8738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:19:12 -0400 (0:00:03.137) 0:12:19.270 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:19:12 -0400 (0:00:00.290) 0:12:19.561 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 
Monday 16 June 2025 17:19:13 -0400 (0:00:00.368) 0:12:19.929 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 16 June 2025 17:19:13 -0400 (0:00:00.335) 0:12:20.265 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108717.0586138, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750108717.0586138, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1750108717.0586138, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2252291874", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 16 June 2025 17:19:15 -0400 (0:00:01.574) 0:12:21.839 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:323 Monday 16 June 2025 17:19:15 -0400 (0:00:00.193) 0:12:22.033 *********** ok: [managed-node11] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testr1vo4i7mlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:330 Monday 16 June 2025 17:19:17 -0400 (0:00:02.479) 0:12:24.512 *********** ok: [managed-node11] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testr1vo4i7mlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1750108758.186707-159080-258566385957334/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:337 Monday 16 June 2025 17:19:21 -0400 (0:00:03.476) 0:12:27.989 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:19:21 -0400 (0:00:00.249) 0:12:28.238 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK 
[fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:19:21 -0400 (0:00:00.378) 0:12:28.617 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:19:22 -0400 (0:00:00.156) 0:12:28.773 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:19:22 -0400 (0:00:00.439) 0:12:29.213 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:19:23 -0400 (0:00:00.432) 0:12:29.645 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:19:23 -0400 (0:00:00.234) 0:12:29.880 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:19:23 -0400 (0:00:00.199) 0:12:30.079 *********** ok: [managed-node11] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:19:23 -0400 (0:00:00.195) 0:12:30.275 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:19:24 -0400 (0:00:00.569) 0:12:30.845 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:19:29 -0400 (0:00:04.816) 0:12:35.661 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testr1vo4i7mlukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:19:29 -0400 (0:00:00.342) 0:12:36.004 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:19:29 -0400 (0:00:00.193) 0:12:36.198 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:19:35 -0400 (0:00:05.439) 0:12:41.638 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:19:35 -0400 (0:00:00.313) 0:12:41.951 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:19:35 -0400 (0:00:00.205) 0:12:42.156 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:19:35 -0400 (0:00:00.188) 0:12:42.345 *********** TASK [fedora.linux_system_roles.storage 
: Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:19:35 -0400 (0:00:00.215) 0:12:42.560 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:19:40 -0400 (0:00:04.822) 0:12:47.383 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": 
"import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { 
"name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:19:44 -0400 (0:00:03.336) 0:12:50.719 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:19:44 -0400 (0:00:00.409) 0:12:51.129 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:19:44 -0400 (0:00:00.186) 0:12:51.315 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:19:58 -0400 (0:00:13.378) 0:13:04.694 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:19:58 -0400 (0:00:00.205) 0:13:04.900 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108660.0721574, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "05c06d22b60351e7a76fe30fa5ec18764533cfb6", "ctime": 1750108660.0691574, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750108660.0691574, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:19:59 -0400 (0:00:01.516) 0:13:06.416 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:20:01 -0400 (0:00:01.362) 0:13:07.778 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:20:01 -0400 (0:00:00.144) 0:13:07.923 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:20:01 -0400 (0:00:00.258) 0:13:08.182 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:20:01 -0400 (0:00:00.264) 0:13:08.447 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:20:02 -0400 (0:00:00.311) 0:13:08.758 *********** changed: [managed-node11] => (item={'src': 'UUID=120ce2df-1c94-4c84-b109-58098c507737', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=120ce2df-1c94-4c84-b109-58098c507737" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:20:03 -0400 (0:00:01.867) 0:13:10.626 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] 
*********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:20:06 -0400 (0:00:02.060) 0:13:12.686 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:20:07 -0400 (0:00:01.812) 0:13:14.498 *********** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:20:08 -0400 (0:00:00.324) 0:13:14.823 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:20:10 -0400 (0:00:01.934) 0:13:16.757 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108671.4692488, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750108664.11619, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 209715404, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1750108664.1151898, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2884407654", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:20:11 -0400 (0:00:01.481) 
0:13:18.238 *********** changed: [managed-node11] => (item={'backing_device': '/dev/sda1', 'name': 'luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', 'password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } } MSG: line added

TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Monday 16 June 2025 17:20:13 -0400 (0:00:01.420) 0:13:19.659 *********** ok: [managed-node11]
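For context, the pool applied above corresponds to a storage role invocation along the lines of the sketch below. This is reconstructed from the blivet_output recorded in this log rather than taken from the test's own source; the play name and the keyfile path are assumptions, and the real key value is masked as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER throughout this run.

- name: Apply a LUKS2-encrypted partition pool (sketch reconstructed from this log)
  hosts: managed-node11
  roles:
    - fedora.linux_system_roles.storage
  vars:
    storage_pools:
      - name: foo                # pool name as logged
        type: partition          # matches "type": "partition" in the pool list above
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g             # "size": "4g" as logged
            fs_type: xfs
            mount_point: /opt/test1
            mount_options: defaults
            encryption: true
            encryption_luks_version: luks2
            encryption_key: /tmp/storage_testkey  # hypothetical path; the logged value is hidden by no_log

The verification tasks that follow compare this same pool description (recorded as _storage_pools_list) against the actual state of /dev/sda1 and its LUKS mapping.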
TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:355 Monday 16 June 2025 17:20:14 -0400 (0:00:01.637) 0:13:21.297 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11

TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:20:15 -0400 (0:00:00.453) 0:13:21.750 *********** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:20:15 -0400 (0:00:00.298) 0:13:22.048 *********** skipping: [managed-node11] => {}

TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:20:15 -0400 (0:00:00.261) 0:13:22.310 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "size": "10G", "type": "crypt", "uuid": "144065e9-4967-4468-aed6-758a4ef1d3c3" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "a469acf8-6556-4ce0-8cc9-0fe0015186de" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } }

TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:20:17 -0400 (0:00:01.572) 0:13:23.883 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.003605", "end": "2025-06-16 17:20:19.326684", "rc": 0, "start": "2025-06-16 17:20:18.323079" }

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 16 June 2025 17:20:19 -0400 (0:00:02.418) 0:13:26.301 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002672", "end": "2025-06-16 17:20:20.618792", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:20:20.616120" }

STDOUT:

luks-a469acf8-6556-4ce0-8cc9-0fe0015186de /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER
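As a reading aid, the entry above follows the crypttab(5) field order of volume name, backing device, and key file (plus optional options); the third field is masked by no_log in this run. An unredacted entry of the same shape would look like the sketch below, with a hypothetical stand-in for the key file path:

# volume name                               backing device  key file (hypothetical path)
luks-a469acf8-6556-4ce0-8cc9-0fe0015186de   /dev/sda1       /tmp/storage_testkey

The mapper name embeds the LUKS UUID a469acf8-6556-4ce0-8cc9-0fe0015186de that blkid reported for /dev/sda1 above, which is what the pool verification tasks that follow check against.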
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 16 June 2025 17:20:20 -0400 (0:00:01.186) 0:13:27.488 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11

TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 16 June 2025 17:20:21 -0400 (0:00:00.903) 0:13:28.392 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 16 June 2025 17:20:22 -0400 (0:00:00.258) 0:13:28.650 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 16 June 2025 17:20:22 -0400 (0:00:00.302) 0:13:28.953 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 16 June 2025 17:20:22 -0400 (0:00:00.299) 0:13:29.253 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11

TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 16 June 2025 17:20:23 -0400 (0:00:00.664) 0:13:29.917 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 16 June 2025 17:20:23 -0400 (0:00:00.275) 0:13:30.193 ***********

TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 16 June 2025 17:20:23 -0400 (0:00:00.188) 0:13:30.381 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 16 June 2025 17:20:24 -0400 (0:00:00.352) 0:13:30.733 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify PV count] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 16 June 2025 17:20:24 -0400 (0:00:00.211) 0:13:30.945 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 16 June 2025 17:20:24 -0400 (0:00:00.206) 0:13:31.151 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 16 June 2025 17:20:24 -0400 (0:00:00.146) 0:13:31.298 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 16 June 2025 17:20:24 -0400 (0:00:00.221) 0:13:31.519 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:54 Monday 16 June 2025 17:20:25 -0400 (0:00:00.241) 0:13:31.761 ***********

TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:67 Monday 16
June 2025 17:20:25 -0400 (0:00:00.100) 0:13:31.861 *********** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.162 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:77 Monday 16 June 2025 17:20:26 -0400 (0:00:01.705) 0:13:33.567 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Monday 16 June 2025 17:20:27 -0400 (0:00:00.219) 0:13:33.787 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 16 June 2025 17:20:27 -0400 (0:00:00.672) 0:13:34.459 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 16 June 2025 17:20:28 -0400 (0:00:00.337) 0:13:34.797 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 16 June 2025 17:20:28 -0400 (0:00:00.310) 0:13:35.107 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 16 June 2025 17:20:28 -0400 (0:00:00.296) 0:13:35.404 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 16 June 2025 17:20:29 -0400 (0:00:00.323) 0:13:35.728 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 16 June 2025 17:20:29 -0400 (0:00:00.222) 0:13:35.950 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 16 June 2025 17:20:29 -0400 (0:00:00.195) 0:13:36.145 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task 
path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 16 June 2025 17:20:29 -0400 (0:00:00.328) 0:13:36.473 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 16 June 2025 17:20:30 -0400 (0:00:00.312) 0:13:36.786 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 16 June 2025 17:20:30 -0400 (0:00:00.190) 0:13:36.976 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 16 June 2025 17:20:30 -0400 (0:00:00.291) 0:13:37.267 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Monday 16 June 2025 17:20:30 -0400 (0:00:00.320) 0:13:37.588 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 16 June 2025 17:20:31 -0400 (0:00:00.549) 0:13:38.137 *********** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", 
"storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Monday 16 June 2025 17:20:31 -0400 (0:00:00.273) 0:13:38.411 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 16 June 2025 17:20:32 -0400 (0:00:00.456) 0:13:38.868 *********** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Monday 16 June 2025 17:20:32 -0400 (0:00:00.251) 0:13:39.119 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 16 June 2025 17:20:33 -0400 (0:00:00.577) 0:13:39.697 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 16 June 2025 17:20:33 -0400 (0:00:00.275) 0:13:39.973 *********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 16 June 2025 17:20:33 -0400 (0:00:00.252) 0:13:40.225 *********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 16 June 2025 17:20:33 -0400 (0:00:00.261) 0:13:40.487 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Monday 16 June 2025 17:20:34 -0400 (0:00:00.231) 0:13:40.719 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 16 June 2025 17:20:34 -0400 (0:00:00.634) 0:13:41.353 *********** skipping: [managed-node11] => (item={'encryption': True, 'encryption_cipher': None, 
'encryption_key': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Monday 16 June 2025 17:20:34 -0400 (0:00:00.228) 0:13:41.581 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 16 June 2025 17:20:35 -0400 (0:00:00.553) 0:13:42.134 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 16 June 2025 17:20:36 -0400 (0:00:00.771) 0:13:42.906 *********** skipping: [managed-node11] => {} 
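
Note on the loop items above: the literal string VALUE_SPECIFIED_IN_NO_LOG_PARAMETER is not a real key. Ansible substitutes it for the value of any module argument declared with no_log in the module's argument spec, so the LUKS key never appears in the log even when the whole volume dict is echoed back. A minimal sketch of the task-level variant of the same protection (hypothetical task, not part of this run; parameter-level no_log in the module spec is what produces the placeholder seen here):

    - name: Handle a secret without logging the result
      ansible.builtin.command: /bin/true
      register: _result
      no_log: true
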
TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 16 June 2025 17:20:36 -0400 (0:00:00.272) 0:13:43.179 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 16 June 2025 17:20:36 -0400 (0:00:00.211) 0:13:43.391 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 16 June 2025 17:20:37 -0400 (0:00:00.267) 0:13:43.658 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 16 June 2025 17:20:37 -0400 (0:00:00.225) 0:13:43.883 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 16 June 2025 17:20:37 -0400 (0:00:00.281) 0:13:44.165 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:105 Monday 16 June 2025 17:20:37 -0400 (0:00:00.219) 0:13:44.384 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 16 June 2025 17:20:38 -0400 (0:00:00.254) 0:13:44.639 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:20:38 -0400 (0:00:00.517) 0:13:45.157 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
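
The eight files included below correspond one-to-one to the _storage_volume_tests list set just above. A plausible sketch of the dispatch task (the loop_var name is inferred from the task title; the actual test file may differ in detail):

    - name: Run test verify for {{ storage_test_volume_subset }}
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset
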
TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:20:38 -0400 (0:00:00.281) 0:13:45.438 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:20:40 -0400 (0:00:01.274) 0:13:46.713 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:20:40 -0400 (0:00:00.215) 0:13:46.929 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:20:40 -0400 (0:00:00.228) 0:13:47.158 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:20:40 -0400 (0:00:00.445) 0:13:47.603 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:20:41 -0400 (0:00:00.313) 0:13:47.916 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:20:41 -0400 (0:00:00.229) 0:13:48.146 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path:
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:20:41 -0400 (0:00:00.240) 0:13:48.386 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 16 June 2025 17:20:42 -0400 (0:00:00.282) 0:13:48.669 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 16 June 2025 17:20:42 -0400 (0:00:00.188) 0:13:48.857 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 16 June 2025 17:20:42 -0400 (0:00:00.166) 0:13:49.023 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 16 June 2025 17:20:42 -0400 (0:00:00.324) 0:13:49.347 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 16 June 2025 17:20:42 -0400 (0:00:00.240) 0:13:49.588 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 16 June 2025 17:20:43 -0400 (0:00:00.324) 0:13:49.913 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 16 June 2025 17:20:43 -0400 (0:00:00.255) 0:13:50.168 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed
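
The fstab checks above count regex matches against /etc/fstab and assert exactly one hit per property (device id, mount point, mount options). A stand-alone equivalent looks roughly like this (hedged sketch; the task and variable names here are illustrative, not the role's own):

    - name: Read /etc/fstab
      ansible.builtin.slurp:
        path: /etc/fstab
      register: _fstab

    - name: Expect exactly one fstab line for the LUKS mapper device
      ansible.builtin.assert:
        that:
          - (_fstab.content | b64decode).splitlines() | select('search', '^/dev/mapper/luks-') | list | length == 1
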
TASK [Verify mount_options] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 16 June 2025 17:20:43 -0400 (0:00:00.272) 0:13:50.441 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 16 June 2025 17:20:44 -0400 (0:00:00.244) 0:13:50.685 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 16 June 2025 17:20:44 -0400 (0:00:00.273) 0:13:50.958 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 16 June 2025 17:20:44 -0400 (0:00:00.269) 0:13:51.227 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 16 June 2025 17:20:44 -0400 (0:00:00.359) 0:13:51.587 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 16 June 2025 17:20:45 -0400 (0:00:00.256) 0:13:51.843 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108797.6422591, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108797.6422591, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 182953, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1750108797.6422591, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 16 June 2025 17:20:46 -0400 (0:00:01.553) 0:13:53.397 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path:
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 16 June 2025 17:20:46 -0400 (0:00:00.169) 0:13:53.566 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 16 June 2025 17:20:47 -0400 (0:00:00.391) 0:13:53.958 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 16 June 2025 17:20:47 -0400 (0:00:00.207) 0:13:54.166 *********** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 16 June 2025 17:20:47 -0400 (0:00:00.214) 0:13:54.380 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 16 June 2025 17:20:47 -0400 (0:00:00.175) 0:13:54.556 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 16 June 2025 17:20:48 -0400 (0:00:00.197) 0:13:54.753 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108797.8012605, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108797.8012605, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 214970, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750108797.8012605, "nlink": 1, "path": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 16 June 2025 17:20:49 -0400 (0:00:01.150) 0:13:55.903 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 16 June 2025 17:20:53 -0400 (0:00:04.000) 0:13:59.904 *********** ok: [managed-node11] => { "changed": 
false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010433", "end": "2025-06-16 17:20:54.339701", "rc": 0, "start": "2025-06-16 17:20:54.329268" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           a469acf8-6556-4ce0-8cc9-0fe0015186de
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
     offset: 16777216 [bytes]
     length: (whole device)
     cipher: aes-xts-plain64
     sector: 512 [bytes]

Keyslots:
  0: luks2
     Key:        512 bits
     Priority:   normal
     Cipher:     aes-xts-plain64
     Cipher key: 512 bits
     PBKDF:      argon2i
     Time cost:  4
     Memory:     926692
     Threads:    2
     Salt:       d4 78 e0 ad d9 5f 0b 9c c9 7a cc d5 83 91 d4 67
                 aa 05 b6 d0 19 ed 86 96 ce 18 b1 1d 7d 84 7f 45
     AF stripes: 4000
     AF hash:    sha256
     Area offset:32768 [bytes]
     Area length:258048 [bytes]
     Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
     Hash:       sha256
     Iterations: 120692
     Salt:       10 9f 60 38 97 1d 5b c0 47 23 7a f4 a5 16 80 89
                 36 21 90 57 50 1b b6 93 1c d4 a2 4d f8 86 06 f4
     Digest:     bd 2c a7 c7 4a cb 65 9b 09 85 fc ca 23 f9 ee ff
                 f7 35 5d d4 97 b5 cb c1 55 7f 2a 71 22 20 08 03
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 16 June 2025 17:20:54 -0400 (0:00:01.368) 0:14:01.272 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 16 June 2025 17:20:54 -0400 (0:00:00.293) 0:14:01.566 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 16 June 2025 17:20:55 -0400 (0:00:00.280) 0:14:01.847 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 16 June 2025 17:20:55 -0400 (0:00:00.227) 0:14:02.075 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 16 June 2025 17:20:55 -0400 (0:00:00.243) 0:14:02.318 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 16 June 2025 17:20:56 -0400 (0:00:00.459) 0:14:02.778 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
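
The luksDump header above is the ground truth these assertions test against: Version 2 confirms the requested encryption_luks_version of luks2, and aes-xts-plain64 with a 512-bit key matches the usual cryptsetup LUKS2 defaults. A stand-alone check along the same lines (hedged sketch, not the test's own task):

    - name: Collect LUKS info for this volume
      ansible.builtin.command: cryptsetup luksDump /dev/sda1
      register: _luks
      changed_when: false

    - name: Assert the device uses LUKS2
      ansible.builtin.assert:
        that:
          - _luks.stdout is search('Version:\s+2')
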
TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 16 June 2025 17:20:56 -0400 (0:00:00.211) 0:14:02.989 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 16 June 2025 17:20:56 -0400 (0:00:00.243) 0:14:03.233 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-a469acf8-6556-4ce0-8cc9-0fe0015186de /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 16 June 2025 17:20:56 -0400 (0:00:00.336) 0:14:03.569 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 16 June 2025 17:20:57 -0400 (0:00:00.300) 0:14:03.869 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 16 June 2025 17:20:57 -0400 (0:00:00.286) 0:14:04.156 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 16 June 2025 17:20:57 -0400 (0:00:00.313) 0:14:04.469 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 16 June 2025 17:20:58 -0400 (0:00:00.337) 0:14:04.807 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
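
For reference, the crypttab entry verified above follows the standard /etc/crypttab format: mapping name, backing device, key file (or "-" for a passphrase prompt), then options. The key file path is masked in this log. An illustrative stand-alone check (hypothetical tasks; the key file comment below uses a made-up path):

    # /etc/crypttab format: <name> <device> <keyfile|-> [options]
    # e.g. luks-a469acf8-6556-4ce0-8cc9-0fe0015186de /dev/sda1 /path/to/keyfile
    - name: Count crypttab entries for the volume
      ansible.builtin.command: grep -c '^luks-a469acf8-6556-4ce0-8cc9-0fe0015186de ' /etc/crypttab
      register: _entries
      changed_when: false
      failed_when: false

    - name: Expect exactly one entry
      ansible.builtin.assert:
        that:
          - _entries.stdout | trim == '1'
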
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 16 June 2025 17:20:58 -0400 (0:00:00.219) 0:14:05.026 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 16 June 2025 17:20:58 -0400 (0:00:00.255) 0:14:05.281 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 16 June 2025 17:20:58 -0400 (0:00:00.272) 0:14:05.553 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 16 June 2025 17:20:59 -0400 (0:00:00.264) 0:14:05.818 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 16 June 2025 17:20:59 -0400 (0:00:00.200) 0:14:06.018 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:20:59 -0400 (0:00:00.252) 0:14:06.270 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:20:59 -0400 (0:00:00.235) 0:14:06.506 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:21:00 -0400 (0:00:00.218) 0:14:06.724 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:21:00 -0400 (0:00:00.288) 0:14:07.012 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:21:00 -0400 (0:00:00.278) 0:14:07.291 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:21:00 -0400 (0:00:00.277) 0:14:07.569 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:21:01 -0400 (0:00:00.319) 0:14:07.889 *********** skipping:
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:21:01 -0400 (0:00:00.271) 0:14:08.160 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:21:01 -0400 (0:00:00.188) 0:14:08.348 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:21:01 -0400 (0:00:00.237) 0:14:08.586 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:21:02 -0400 (0:00:00.176) 0:14:08.763 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:21:02 -0400 (0:00:00.174) 0:14:08.937 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:21:02 -0400 (0:00:00.280) 0:14:09.217 *********** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:21:02 -0400 (0:00:00.215) 0:14:09.433 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:21:03 -0400 (0:00:00.251) 0:14:09.684 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:21:03 -0400 (0:00:00.272) 0:14:09.957 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
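
The "Show expected size" result earlier in this block is Ansible's standard rendering when a debug task points at an undefined variable; the test tolerates it because all the size checks are skipped for this volume type. In similar test code the noise can be avoided by supplying a default (illustrative sketch, not the test's own task):

    - name: Show expected size
      ansible.builtin.debug:
        msg: "{{ storage_test_expected_size | d('size check not applicable') }}"
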
TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:21:03 -0400 (0:00:00.248) 0:14:10.205 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 16 June 2025 17:21:03 -0400 (0:00:00.249) 0:14:10.455 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 16 June 2025 17:21:04 -0400 (0:00:00.262) 0:14:10.718 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:21:04 -0400 (0:00:00.196) 0:14:10.915 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:21:04 -0400 (0:00:00.303) 0:14:11.218 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:21:04 -0400 (0:00:00.201) 0:14:11.420 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:21:04 -0400 (0:00:00.169) 0:14:11.589 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:21:05 -0400 (0:00:00.307) 0:14:11.897 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:21:05 -0400 (0:00:00.135) 0:14:12.033 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:21:05 -0400 (0:00:00.256) 0:14:12.289 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:21:05 -0400
(0:00:00.321) 0:14:12.610 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:21:06 -0400 (0:00:00.265) 0:14:12.876 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 17:21:06 -0400 (0:00:00.174) 0:14:13.050 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:21:06 -0400 (0:00:00.235) 0:14:13.285 *********** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:21:06 -0400 (0:00:00.318) 0:14:13.604 *********** ok: [managed-node11] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:21:07 -0400 (0:00:00.364) 0:14:13.969 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:21:07 -0400 (0:00:00.282) 0:14:14.251 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:21:07 -0400 (0:00:00.284) 0:14:14.536 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:21:08 -0400 (0:00:00.804) 0:14:15.341 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:21:09 -0400 (0:00:00.306) 0:14:15.648 *********** skipping: [managed-node11] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:21:09 -0400 (0:00:00.248) 0:14:15.896 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 16 June 2025 17:21:09 -0400 (0:00:00.286) 0:14:16.182 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 16 June 2025 17:21:09 -0400 (0:00:00.213) 0:14:16.395 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 16 June 2025 17:21:09 -0400 (0:00:00.181) 0:14:16.577 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:21:10 -0400 (0:00:00.251) 0:14:16.829 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 16 June 2025 17:21:10 -0400 (0:00:00.172) 0:14:17.001 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:358 Monday 16 June 2025 17:21:10 -0400 (0:00:00.263) 0:14:17.264 *********** ok: [managed-node11] => { "changed": false, "path": "/tmp/storage_testr1vo4i7mlukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:368 Monday 16 June 2025 17:21:12 -0400 (0:00:01.756) 0:14:19.020 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 16 June 2025 17:21:12 -0400 (0:00:00.339) 0:14:19.360 *********** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: 
TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 16 June 2025 17:21:13 -0400 (0:00:00.279) 0:14:19.639 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:21:13 -0400 (0:00:00.393) 0:14:20.033 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:21:13 -0400 (0:00:00.394) 0:14:20.427 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:21:14 -0400 (0:00:00.348) 0:14:20.775 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:21:14 -0400 (0:00:00.601) 0:14:21.377 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:21:15 -0400 (0:00:00.393) 0:14:21.770 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
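
This inner role run is driven by the storage_pools value printed by the "Show storage_pools" task below: an encrypted LUKS2 LVM volume with no encryption_password or encryption_key supplied, which the test expects safe mode to reject. A minimal stand-alone invocation would look roughly like this (assumed wrapper playbook, not the test's own file; storage_safe_mode already defaults to true):

    - hosts: managed-node11
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: true
            storage_pools:
              - name: foo
                disks: [sda]
                type: lvm
                volumes:
                  - name: test1
                    size: 4g
                    mount_point: /opt/test1
                    encryption: true
                    encryption_luks_version: luks2
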
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:21:15 -0400 (0:00:00.284) 0:14:22.055 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:21:15 -0400 (0:00:00.313) 0:14:22.368 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:21:16 -0400 (0:00:00.280) 0:14:22.648 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:21:16 -0400 (0:00:00.498) 0:14:23.147 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:21:21 -0400 (0:00:05.130) 0:14:28.277 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:21:22 -0400 (0:00:00.382) 0:14:28.660 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:21:22 -0400 (0:00:00.279) 0:14:28.940 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:21:27 -0400 (0:00:05.499) 0:14:34.440 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:21:28 -0400 (0:00:00.292) 0:14:34.915 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path:
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:21:28 -0400 (0:00:00.292) 0:14:35.207 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:21:28 -0400 (0:00:00.291) 0:14:35.499 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:21:29 -0400 (0:00:00.211) 0:14:35.710 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:21:34 -0400 (0:00:04.934) 0:14:40.645 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": 
"cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": 
"getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", 
"source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { 
"name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" 
}, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": 
{ "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:21:37 -0400 (0:00:03.181) 0:14:43.826 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:21:37 -0400 (0:00:00.426) 0:14:44.252 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:21:37 -0400 (0:00:00.265) 0:14:44.517 *********** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:21:43 -0400 (0:00:05.894) 0:14:50.412 *********** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:21:44 -0400 (0:00:00.237) 0:14:50.649 *********** TASK [Check that we failed in the role] **************************************** task path: 
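
This failure is the point of the scenario: the module args in the dump above show 'encryption_key': None and 'encryption_password': None for volume test1, so blivet refuses with "encrypted volume 'test1' missing key/password". The retry later in this log (the "Create an encrypted lvm volume w/ default fs" play) supplies the missing key material; as a sketch, only the volume entry changes (values taken from the "Show storage_pools" output of that retry; in real use the passphrase would normally come from Ansible Vault rather than being written in plain text):

    volumes:
      - name: test1
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        encryption_cipher: aes-xts-plain64    # explicit cipher, as in the retry below
        encryption_key_size: 512              # bits; XTS splits this into two 256-bit subkeys
        encryption_password: yabbadabbadoo    # throwaway test value from this log
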
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:21:44 -0400 (0:00:00.131) 0:14:50.781 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:21:44 -0400 (0:00:00.269) 0:14:51.050 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Monday 16 June 2025 17:21:44 -0400 (0:00:00.318) 0:14:51.369 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:387 Monday 16 June 2025 17:21:45 -0400 (0:00:00.299) 0:14:51.669 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:21:45 -0400 (0:00:00.472) 0:14:52.142 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:21:45 -0400 (0:00:00.306) 0:14:52.448 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:21:46 -0400 (0:00:00.386) 0:14:52.835 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, 
"ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:21:46 -0400 (0:00:00.503) 0:14:53.338 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:21:46 -0400 (0:00:00.116) 0:14:53.455 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:21:46 -0400 (0:00:00.111) 0:14:53.566 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:21:47 -0400 (0:00:00.179) 0:14:53.745 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:21:47 -0400 (0:00:00.212) 0:14:53.958 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:21:47 -0400 (0:00:00.428) 0:14:54.386 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:21:52 -0400 (0:00:04.325) 0:14:58.712 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:21:52 -0400 (0:00:00.355) 0:14:59.067 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:21:52 -0400 (0:00:00.331) 0:14:59.398 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:21:58 -0400 (0:00:05.449) 0:15:04.848 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:21:58 -0400 (0:00:00.529) 0:15:05.377 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:21:59 -0400 (0:00:00.263) 0:15:05.641 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:21:59 -0400 (0:00:00.219) 0:15:05.860 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:21:59 -0400 (0:00:00.173) 0:15:06.033 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:22:04 -0400 (0:00:04.836) 0:15:10.870 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:22:07 -0400 (0:00:02.918) 0:15:13.788 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:22:07 
-0400 (0:00:00.398) 0:15:14.187 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:22:07 -0400 (0:00:00.193) 0:15:14.380 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": 
null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:22:22 -0400 (0:00:14.529) 0:15:28.910 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:22:22 -0400 (0:00:00.206) 0:15:29.116 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108807.494338, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "cb168d247c7c62567f4de51271ab6cb17bb488f5", "ctime": 1750108807.490338, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750108807.490338, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:22:23 -0400 (0:00:01.157) 0:15:30.274 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:22:25 -0400 (0:00:01.505) 0:15:31.779 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:22:25 -0400 (0:00:00.175) 0:15:31.955 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": 
"/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:22:25 -0400 (0:00:00.227) 0:15:32.183 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:22:25 -0400 (0:00:00.291) 0:15:32.475 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:22:26 -0400 (0:00:00.288) 0:15:32.763 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a469acf8-6556-4ce0-8cc9-0fe0015186de" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:22:27 -0400 (0:00:01.843) 0:15:34.606 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:22:30 -0400 (0:00:02.100) 0:15:36.707 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 
'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:22:31 -0400 (0:00:01.662) 0:15:38.370 *********** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:22:32 -0400 (0:00:00.265) 0:15:38.636 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:22:33 -0400 (0:00:01.958) 0:15:40.594 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108820.6184433, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "89d5e888e469d991603dd183c2fbd23bf00f68d2", "ctime": 1750108812.7703803, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 350224521, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1750108812.7693803, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "2580630657", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:22:35 -0400 (0:00:01.330) 0:15:41.925 *********** changed: [managed-node11] => (item={'backing_device': '/dev/sda1', 'name': 'luks-a469acf8-6556-4ce0-8cc9-0fe0015186de', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 
line(s) removed changed: [managed-node11] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Monday 16 June 2025 17:22:38 -0400 (0:00:03.119) 0:15:45.044 *********** ok: [managed-node11] TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:406 Monday 16 June 2025 17:22:40 -0400 (0:00:02.081) 0:15:47.126 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:22:41 -0400 (0:00:00.524) 0:15:47.650 *********** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:22:41 -0400 (0:00:00.283) 0:15:47.934 *********** skipping: [managed-node11] => {} 
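
A quick gloss on the two /etc/crypttab edits above: the role removed the stale entry for the old LUKS1 device on /dev/sda1 and added one for the new LUKS2 device stacked on the logical volume. A crypttab entry has the form "name backing-device key-file [options]"; the "-" in the key-file field means no key file is configured, so unlocking relies on a passphrase held in a LUKS2 keyslot. The entry the test expects to read back a few tasks below is:

    luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 -

The matching /etc/fstab mount line and the role's "# system_role:storage" fingerprint likewise show up when the test reads that file back below.

TASK [Collect info about the volumes.] 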
***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:22:41 -0400 (0:00:00.269) 0:15:48.203 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" }, "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "size": "4G", "type": "crypt", "uuid": "fbd209dd-5a9e-4907-bf76-88e66381524b" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "al0uCf-R2qd-eaR5-lcbe-8d4g-1JeN-2yYBO1" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:22:43 -0400 (0:00:01.670) 0:15:49.873 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002643", "end": "2025-06-16 17:22:44.465956", "rc": 0, "start": "2025-06-16 17:22:44.463313" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 16 June 2025 17:22:44 -0400 (0:00:01.606) 0:15:51.480 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002681", "end": "2025-06-16 17:22:46.249824", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:22:46.247143" } STDOUT: luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 16 June 2025 17:22:46 -0400 (0:00:01.773) 0:15:53.253 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 16 June 2025 17:22:47 -0400 (0:00:00.526) 0:15:53.780 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 16 June 2025 17:22:47 -0400 (0:00:00.225) 0:15:54.005 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.024298", "end": "2025-06-16 17:22:48.743348", "rc": 0, "start": "2025-06-16 17:22:48.719050" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 16 June 2025 17:22:49 -0400 (0:00:01.709) 0:15:55.715 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed 
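
The shared-VG check above is a plain lvm2 query: with --binary, vgs reports the shared attribute as 0 or 1 instead of a text label, so the test can assert on it directly. A sketch of that task pair, consistent with the command and result shown; the register name vgs_shared and the exact assert expression are assumptions, not the test's verbatim source:

    - name: Get VG shared value status
      command: vgs --noheadings --binary -o shared foo
      register: vgs_shared        # assumed variable name
      changed_when: false

    - name: Verify that VG shared value checks out
      assert:
        that:
          - vgs_shared.stdout | trim == '0'   # '0' = not shared, matching shared: false in the pool spec

TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 16 June 2025 17:22:49 -0400 (0:00:00.420) 0:15:56.135 *********** included: 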
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 16 June 2025 17:22:50 -0400 (0:00:00.560) 0:15:56.696 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 16 June 2025 17:22:50 -0400 (0:00:00.668) 0:15:57.365 *********** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 16 June 2025 17:22:53 -0400 (0:00:02.360) 0:15:59.725 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 16 June 2025 17:22:53 -0400 (0:00:00.328) 0:16:00.053 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 16 June 2025 17:22:53 -0400 (0:00:00.323) 0:16:00.377 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 16 June 2025 17:22:54 -0400 (0:00:00.380) 0:16:00.758 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 16 June 2025 17:22:54 -0400 (0:00:00.244) 0:16:01.002 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 16 June 2025 17:22:54 -0400 (0:00:00.309) 0:16:01.312 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:54 Monday 16 June 2025 17:22:54 -0400 (0:00:00.268) 0:16:01.581 *********** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:67 Monday 16 June 2025 17:22:55 -0400 (0:00:00.421) 0:16:02.003 *********** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.162 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:77 Monday 16 June 2025 17:22:56 -0400 (0:00:01.626) 0:16:03.629 *********** skipping: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Monday 16 June 2025 17:22:57 -0400 (0:00:00.299) 0:16:03.928 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 16 June 2025 17:22:57 -0400 (0:00:00.396) 0:16:04.325 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 16 June 2025 17:22:57 -0400 (0:00:00.144) 0:16:04.469 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 16 June 2025 17:22:58 -0400 (0:00:00.285) 0:16:04.755 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 16 June 2025 17:22:58 -0400 (0:00:00.225) 0:16:04.981 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 16 June 2025 17:22:58 -0400 (0:00:00.164) 0:16:05.145 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 16 June 2025 17:22:58 -0400 (0:00:00.227) 0:16:05.373 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 16 June 2025 17:22:58 -0400 (0:00:00.200) 0:16:05.573 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 16 June 2025 17:22:59 -0400 (0:00:00.216) 0:16:05.790 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 16 June 2025 17:22:59 -0400 (0:00:00.242) 0:16:06.032 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 16 June 2025 17:22:59 -0400 (0:00:00.241) 0:16:06.274 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 16 June 2025 17:22:59 -0400 (0:00:00.294) 0:16:06.568 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Monday 16 June 2025 17:23:00 -0400 (0:00:00.220) 0:16:06.789 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 16 June 2025 17:23:00 -0400 (0:00:00.565) 0:16:07.354 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node11 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 16 June 2025 17:23:01 -0400 (0:00:00.554) 0:16:07.908 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 16 June 2025 17:23:01 -0400 (0:00:00.199) 0:16:08.107 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 16 June 2025 17:23:01 -0400 (0:00:00.331) 0:16:08.439 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 16 June 2025 17:23:02 -0400 (0:00:00.252) 0:16:08.691 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 16 June 2025 17:23:02 -0400 (0:00:00.309) 0:16:09.001 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 16 June 2025 17:23:02 -0400 (0:00:00.293) 0:16:09.294 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 16 June 2025 17:23:02 -0400 (0:00:00.259) 0:16:09.553 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Monday 16 June 2025 17:23:03 -0400 (0:00:00.314) 0:16:09.868 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 16 June 2025 17:23:03 -0400 (0:00:00.700) 0:16:10.569 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node11 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 16 June 2025 17:23:04 -0400 (0:00:00.459) 0:16:11.028 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 16 June 2025 17:23:04 -0400 (0:00:00.399) 0:16:11.428 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 16 June 2025 17:23:05 -0400 (0:00:00.747) 0:16:12.176 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 16 June 2025 17:23:05 -0400 (0:00:00.195) 0:16:12.371 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Monday 16 June 2025 17:23:05 -0400 (0:00:00.210) 0:16:12.582 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 16 June 2025 17:23:06 -0400 (0:00:00.606) 0:16:13.188 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 16 June 2025 17:23:06 -0400 (0:00:00.346) 0:16:13.535 *********** skipping: [managed-node11] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 16 June 2025 17:23:07 -0400 (0:00:00.295) 0:16:13.830 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node11 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 16 June 2025 17:23:07 -0400 (0:00:00.595) 0:16:14.426 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 16 June 2025 17:23:08 -0400 (0:00:00.312) 0:16:14.738 *********** ok: [managed-node11] => { "changed": false } MSG: All 
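assertions passed

The zero-entry expectation here is deliberate: encryption was requested per volume (encryption: true on test1), not per pool, so the pool member /dev/sda is a plain LVM PV and must contribute no /etc/crypttab entries. In effect, the check above asserts the following, using the facts set a few tasks earlier; this is a paraphrase of the assertion, not the test's verbatim source:

    - name: Check for /etc/crypttab entry
      assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int   # 0 expected for an unencrypted pool member
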
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 16 June 2025 17:23:08 -0400 (0:00:00.286) 0:16:15.025 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 16 June 2025 17:23:08 -0400 (0:00:00.353) 0:16:15.379 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 16 June 2025 17:23:09 -0400 (0:00:00.285) 0:16:15.665 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 16 June 2025 17:23:09 -0400 (0:00:00.232) 0:16:15.897 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 16 June 2025 17:23:09 -0400 (0:00:00.275) 0:16:16.173 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Monday 16 June 2025 17:23:09 -0400 (0:00:00.176) 0:16:16.350 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 16 June 2025 17:23:10 -0400 (0:00:00.489) 0:16:16.839 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node11 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 16 June 2025 17:23:10 -0400 (0:00:00.427) 0:16:17.267 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 16 June 2025 17:23:10 -0400 (0:00:00.243) 0:16:17.511 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 16 June 2025 17:23:11 -0400 (0:00:00.313) 0:16:17.824 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 16 June 2025 17:23:11 -0400 (0:00:00.229) 0:16:18.053 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 16 June 2025 17:23:11 -0400 (0:00:00.242) 0:16:18.296 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 16 June 2025 17:23:11 -0400 (0:00:00.302) 0:16:18.598 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 16 June 2025 17:23:12 -0400 (0:00:00.302) 0:16:18.901 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Monday 16 June 2025 17:23:12 -0400 (0:00:00.167) 0:16:19.069 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 16 June 2025 17:23:12 -0400 (0:00:00.558) 0:16:19.627 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 16 June 2025 17:23:13 -0400 (0:00:00.298) 0:16:19.926 *********** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 16 June 2025 17:23:13 -0400 (0:00:00.335) 0:16:20.262 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 16 June 2025 17:23:13 
-0400 (0:00:00.334) 0:16:20.596 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 16 June 2025 17:23:14 -0400 (0:00:00.323) 0:16:20.919 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 16 June 2025 17:23:14 -0400 (0:00:00.279) 0:16:21.199 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 16 June 2025 17:23:14 -0400 (0:00:00.224) 0:16:21.423 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:105 Monday 16 June 2025 17:23:14 -0400 (0:00:00.200) 0:16:21.624 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 16 June 2025 17:23:15 -0400 (0:00:00.246) 0:16:21.871 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:23:15 -0400 (0:00:00.460) 0:16:22.331 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:23:15 -0400 (0:00:00.287) 0:16:22.618 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:23:17 -0400 (0:00:01.396) 0:16:24.015 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:23:17 -0400 (0:00:00.310) 0:16:24.325 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:23:18 -0400 (0:00:00.794) 0:16:25.120 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:23:18 -0400 (0:00:00.329) 0:16:25.450 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:23:19 -0400 (0:00:00.261) 0:16:25.711 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:23:19 -0400 (0:00:00.283) 0:16:25.995 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:23:19 -0400 (0:00:00.212) 0:16:26.208 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 16 June 2025 17:23:19 -0400 (0:00:00.228) 0:16:26.436 *********** 
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Monday 16 June 2025 17:23:20 -0400 (0:00:00.326) 0:16:26.763 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 16 June 2025 17:23:20 -0400 (0:00:00.200) 0:16:26.964 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 16 June 2025 17:23:20 -0400 (0:00:00.297) 0:16:27.261 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 16 June 2025 17:23:20 -0400 (0:00:00.250) 0:16:27.512 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 16 June 2025 17:23:21 -0400 (0:00:00.687) 0:16:28.199 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 16 June 2025 17:23:21 -0400 (0:00:00.305) 0:16:28.505 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 16 June 2025 17:23:22 -0400 (0:00:00.268) 0:16:28.773 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 16 June 2025 17:23:22 -0400 (0:00:00.206) 0:16:28.980 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
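
EDITOR'S NOTE: the fstab tasks above count lines in /etc/fstab that match the device, mount point, and mount options, then compare those counts to expected values. A minimal standalone sketch of the same idea, using documented modules and filters; the register name and task names here are ours, not the role's:

    - name: Read /etc/fstab (sketch)
      ansible.builtin.slurp:
        src: /etc/fstab
      register: fstab

    - name: Assert the device identifier appears exactly once (sketch)
      ansible.builtin.assert:
        that:
          - fstab.content | b64decode | regex_findall('/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 ') | length == 1
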
TASK [Clean up variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 16 June 2025 17:23:22 -0400 (0:00:00.216) 0:16:29.196 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 16 June 2025 17:23:22 -0400 (0:00:00.150) 0:16:29.347 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 16 June 2025 17:23:23 -0400 (0:00:00.331) 0:16:29.678 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 16 June 2025 17:23:23 -0400 (0:00:00.317) 0:16:29.995 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108941.6844122, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108941.6844122, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 230514, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750108941.6844122, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 16 June 2025 17:23:24 -0400 (0:00:01.511) 0:16:31.507 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 16 June 2025 17:23:25 -0400 (0:00:00.345) 0:16:31.852 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 16 June 2025 17:23:25 -0400 (0:00:00.280) 0:16:32.133 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
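
EDITOR'S NOTE: the "presence/absence of the device node" tasks above follow a stat-then-assert pattern. A minimal standalone sketch of that pattern, using the modules' documented return values; the task and register names are ours, not the role's:

    - name: Stat the volume device node (sketch)
      ansible.builtin.stat:
        path: /dev/mapper/foo-test1
      register: dev_node

    - name: Assert the node exists and is a block device (sketch)
      ansible.builtin.assert:
        that:
          - dev_node.stat.exists
          - dev_node.stat.isblk
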
TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 16 June 2025 17:23:25 -0400 (0:00:00.308) 0:16:32.441 ***********
ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }
TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 16 June 2025 17:23:26 -0400 (0:00:00.277) 0:16:32.719 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 16 June 2025 17:23:26 -0400 (0:00:00.291) 0:16:33.010 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 16 June 2025 17:23:26 -0400 (0:00:00.248) 0:16:33.259 ***********
ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108941.8294134, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750108941.8294134, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 229862, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750108941.8294134, "nlink": 1, "path": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 16 June 2025 17:23:27 -0400 (0:00:01.330) 0:16:34.589 ***********
ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 16 June 2025 17:23:32 -0400 (0:00:04.713) 0:16:39.303 ***********
ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010155", "end": "2025-06-16 17:23:33.976470", "rc": 0, "start": "2025-06-16 17:23:33.966315" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     923141
        Threads:    2
        Salt:       e8 8f 0a de c6 6c 22 50 44 1a 8c 0a c3 c1 81 25
                    40 10 a6 a0 81 20 e0 f7 49 55 5e 29 b9 0d 25 00
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120470
        Salt:       76 fa e1 90 60 ed fc 3a 0f 96 77 81 96 f9 4d 3e
                    d5 21 95 f4 b6 be d5 e6 73 c1 9a 8f 32 4c bb db
        Digest:     d6 54 14 36 6e de 3b 3e b3 a9 e8 29 c3 9b d6 3f
                    90 86 f1 1f 35 d1 d1 31 14 40 64 4a 8f a5 3c aa

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 16 June 2025 17:23:34 -0400 (0:00:01.589) 0:16:40.893 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 16 June 2025 17:23:34 -0400 (0:00:00.313) 0:16:41.207 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 16 June 2025 17:23:34 -0400 (0:00:00.280) 0:16:41.487 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 16 June 2025 17:23:35 -0400 (0:00:00.340) 0:16:41.828 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 16 June 2025 17:23:35 -0400 (0:00:00.297) 0:16:42.125 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 16 June 2025 17:23:35 -0400 (0:00:00.420) 0:16:42.546 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 16 June 2025 17:23:36 -0400 (0:00:00.448) 0:16:42.995 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Set test variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 16 June 2025 17:23:36 -0400 (0:00:00.460) 0:16:43.455 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
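
EDITOR'S NOTE: the crypttab checks that follow verify a three-field "name backing-device key-file" line, where "-" means no key file (passphrase prompt at activation). A minimal standalone sketch of an equivalent check, assuming the mapping recorded in this run; the task and register names are ours, not the role's:

    - name: Read /etc/crypttab (sketch)
      ansible.builtin.slurp:
        src: /etc/crypttab
      register: crypttab

    - name: Assert the expected entry is present (sketch)
      ansible.builtin.assert:
        that:
          - "'luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 -' in crypttab.content | b64decode"
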
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 16 June 2025 17:23:37 -0400 (0:00:00.334) 0:16:43.790 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 16 June 2025 17:23:37 -0400 (0:00:00.282) 0:16:44.073 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 16 June 2025 17:23:37 -0400 (0:00:00.269) 0:16:44.342 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 16 June 2025 17:23:37 -0400 (0:00:00.285) 0:16:44.628 ***********
ok: [managed-node11] => { "changed": false }
MSG: All assertions passed
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 16 June 2025 17:23:38 -0400 (0:00:00.199) 0:16:44.827 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 16 June 2025 17:23:38 -0400 (0:00:00.176) 0:16:45.004 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 16 June 2025 17:23:38 -0400 (0:00:00.289) 0:16:45.293 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 16 June 2025 17:23:38 -0400 (0:00:00.203) 0:16:45.497 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md version regex] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 16 June 2025 17:23:39 -0400 (0:00:00.313) 0:16:45.810 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 16 June 2025 17:23:39
-0400 (0:00:00.193) 0:16:46.004 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:23:39 -0400 (0:00:00.169) 0:16:46.174 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:23:39 -0400 (0:00:00.241) 0:16:46.415 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:23:39 -0400 (0:00:00.197) 0:16:46.613 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:23:40 -0400 (0:00:00.248) 0:16:46.862 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:23:40 -0400 (0:00:00.169) 0:16:47.031 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:23:40 -0400 (0:00:00.187) 0:16:47.218 *********** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:23:43 -0400 (0:00:02.751) 0:16:49.970 *********** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:23:44 -0400 (0:00:01.571) 0:16:51.541 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:23:45 -0400 (0:00:00.381) 0:16:51.923 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of 
parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:23:45 -0400 (0:00:00.185) 0:16:52.108 *********** ok: [managed-node11] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:23:47 -0400 (0:00:01.610) 0:16:53.719 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:23:47 -0400 (0:00:00.329) 0:16:54.048 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:23:47 -0400 (0:00:00.324) 0:16:54.373 *********** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:23:48 -0400 (0:00:00.310) 0:16:54.684 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:23:48 -0400 (0:00:00.273) 0:16:54.957 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:23:48 -0400 (0:00:00.259) 0:16:55.217 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:23:48 -0400 (0:00:00.248) 0:16:55.466 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 16 June 2025 17:23:49 -0400 (0:00:00.212) 0:16:55.678 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 16 June 2025 17:23:49 -0400 (0:00:00.382) 0:16:56.060 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:23:49 -0400 (0:00:00.347) 0:16:56.408 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:23:50 -0400 (0:00:00.315) 0:16:56.724 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:23:50 -0400 (0:00:00.338) 0:16:57.062 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:23:50 -0400 (0:00:00.277) 0:16:57.340 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:23:51 -0400 (0:00:00.314) 0:16:57.655 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:23:51 -0400 (0:00:00.274) 0:16:57.929 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:23:51 -0400 (0:00:00.253) 0:16:58.182 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:23:51 -0400 (0:00:00.202) 0:16:58.384 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:23:51 -0400 (0:00:00.189) 0:16:58.573 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 17:23:52 -0400 (0:00:00.186) 0:16:58.760 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:23:52 -0400 (0:00:00.271) 0:16:59.032 *********** ok: [managed-node11] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:23:52 -0400 (0:00:00.424) 0:16:59.457 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:23:53 -0400 (0:00:00.241) 0:16:59.699 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:23:53 -0400 (0:00:00.393) 0:17:00.092 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024327", "end": "2025-06-16 17:23:54.637761", "rc": 0, "start": "2025-06-16 17:23:54.613434" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:23:54 -0400 (0:00:01.517) 0:17:01.610 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:23:55 -0400 (0:00:00.255) 0:17:01.866 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:23:55 -0400 (0:00:00.369) 0:17:02.236 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:23:55 -0400 (0:00:00.303) 0:17:02.539 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 
TASK [Set expected cache size] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 16 June 2025 17:23:56 -0400 (0:00:00.284) 0:17:02.823 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 16 June 2025 17:23:56 -0400 (0:00:00.307) 0:17:03.131 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 16 June 2025 17:23:56 -0400 (0:00:00.280) 0:17:03.412 ***********
ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 16 June 2025 17:23:57 -0400 (0:00:00.243) 0:17:03.655 ***********
TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 16 June 2025 17:23:57 -0400 (0:00:00.250) 0:17:03.905 ***********
ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
TASK [Verify preservation of encryption settings on existing LVM volume] *******
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:409
Monday 16 June 2025 17:23:57 -0400 (0:00:00.326) 0:17:04.232 ***********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 16 June 2025 17:23:58 -0400 (0:00:00.693) 0:17:04.925 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 16 June 2025 17:23:58 -0400 (0:00:00.416) 0:17:05.342 ***********
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 16 June 2025 17:23:59 -0400 (0:00:00.369) 0:17:05.712 ***********
skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{
'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:23:59 -0400 (0:00:00.580) 0:17:06.292 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:24:00 -0400 (0:00:00.339) 0:17:06.631 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:24:00 -0400 (0:00:00.325) 0:17:06.957 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:24:00 -0400 (0:00:00.371) 0:17:07.329 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:24:00 -0400 (0:00:00.167) 0:17:07.496 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:24:01 -0400 (0:00:00.587) 0:17:08.083 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:24:06 -0400 (0:00:05.520) 0:17:13.604 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } 
] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:24:07 -0400 (0:00:00.239) 0:17:13.843 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:24:07 -0400 (0:00:00.251) 0:17:14.095 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:24:13 -0400 (0:00:05.570) 0:17:19.666 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:24:13 -0400 (0:00:00.457) 0:17:20.124 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:24:13 -0400 (0:00:00.271) 0:17:20.395 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:24:14 -0400 (0:00:00.286) 0:17:20.681 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:24:14 -0400 (0:00:00.187) 0:17:20.869 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:24:19 -0400 (0:00:04.808) 0:17:25.677 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": 
"unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": 
"sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service": { "name": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service": { "name": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", 
"status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:24:22 -0400 (0:00:03.089) 0:17:28.766 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:24:22 -0400 (0:00:00.346) 0:17:29.113 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2da469acf8\x2d6556\x2d4ce0\x2d8cc9\x2d0fe0015186de.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "name": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-a469acf8-6556-4ce0-8cc9-0fe0015186de", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", 
"ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a469acf8-6556-4ce0-8cc9-0fe0015186de /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a469acf8-6556-4ce0-8cc9-0fe0015186de ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:22:33 EDT", "StateChangeTimestampMonotonic": "2589568029", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d6556\x2d4ce0\x2d8cc9\x2d0fe0015186de.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "name": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:24:26 -0400 (0:00:03.969) 0:17:33.082 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": 
null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:24:32 -0400 (0:00:05.826) 0:17:38.909 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:24:32 -0400 (0:00:00.175) 0:17:39.085 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108951.4554906, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "af13c3a106572cc405a934ce35bd466146b20c74", "ctime": 1750108951.4524906, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750108951.4524906, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:24:33 -0400 (0:00:01.483) 0:17:40.569 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:24:34 -0400 (0:00:00.216) 0:17:40.785 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2da469acf8\x2d6556\x2d4ce0\x2d8cc9\x2d0fe0015186de.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "name": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown 
cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": 
"systemd-cryptsetup@luks\\x2da469acf8\\x2d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d6556\x2d4ce0\x2d8cc9\x2d0fe0015186de.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "name": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write 
cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6556\\x2d4ce0\\x2d8cc9\\x2d0fe0015186de.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", 
"ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:24:37 -0400 (0:00:03.684) 0:17:44.470 *********** ok: [managed-node11] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:24:38 -0400 (0:00:00.198) 0:17:44.669 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:24:38 -0400 (0:00:00.245) 0:17:44.915 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:24:38 -0400 (0:00:00.219) 0:17:45.134 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:24:38 -0400 (0:00:00.210) 0:17:45.345 *********** ok: [managed-node11] => { 
"changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:24:40 -0400 (0:00:01.967) 0:17:47.312 *********** ok: [managed-node11] => (item={'src': '/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:24:42 -0400 (0:00:01.708) 0:17:49.021 *********** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:24:42 -0400 (0:00:00.404) 0:17:49.425 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:24:45 -0400 (0:00:02.565) 0:17:51.990 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108966.248609, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "692f28302299cbc7919c026232b557c294a53d1d", "ctime": 1750108958.1075437, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 471859331, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1750108958.1065438, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2312515694", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:24:47 -0400 (0:00:01.690) 0:17:53.681 *********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Monday 16 June 2025 17:24:47 -0400 (0:00:00.235) 0:17:53.917 *********** ok: [managed-node11] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:423 Monday 16 June 2025 17:24:49 -0400 (0:00:02.042) 0:17:55.959 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:430 Monday 16 June 2025 17:24:49 -0400 (0:00:00.456) 0:17:56.415 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:24:50 -0400 (0:00:00.388) 0:17:56.803 *********** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:24:50 -0400 (0:00:00.423) 0:17:57.227 *********** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:24:50 -0400 (0:00:00.201) 0:17:57.428 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" }, "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "size": "4G", "type": "crypt", "uuid": "fbd209dd-5a9e-4907-bf76-88e66381524b" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "al0uCf-R2qd-eaR5-lcbe-8d4g-1JeN-2yYBO1" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:24:52 -0400 (0:00:01.588) 0:17:59.016 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002675", "end": "2025-06-16 17:24:53.629590", "rc": 0, "start": "2025-06-16 17:24:53.626915" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Monday 16 June 2025 17:24:52 -0400 (0:00:01.588) 0:17:59.016 ***********
ok: [managed-node11] => {"changed": false, "cmd": ["cat", "/etc/fstab"], "delta": "0:00:00.002675", "end": "2025-06-16 17:24:53.629590", "rc": 0, "start": "2025-06-16 17:24:53.626915"}
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Monday 16 June 2025 17:24:53 -0400 (0:00:01.514) 0:18:00.531 ***********
ok: [managed-node11] => {"changed": false, "cmd": ["cat", "/etc/crypttab"], "delta": "0:00:00.002660", "end": "2025-06-16 17:24:55.158338", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:24:55.155678"}
STDOUT:
luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 -
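The two file reads above feed the later assertions. A minimal sketch of that style of check, using plain substring tests for brevity (register names are from the sketch, not the tests' own variables):

    - name: Read fstab, as the task above does
      ansible.builtin.command: cat /etc/fstab
      register: fstab
      changed_when: false

    - name: Assert the volume is mounted via its LUKS mapping
      ansible.builtin.assert:
        that:
          - "'/dev/mapper/luks-' in fstab.stdout"
          - "'/opt/test1 xfs defaults' in fstab.stdout"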
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Monday 16 June 2025 17:24:55 -0400 (0:00:01.521) 0:18:02.052 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Monday 16 June 2025 17:24:56 -0400 (0:00:00.582) 0:18:02.634 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_pool_tests": ["members", "volumes"]}, "changed": false}

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Monday 16 June 2025 17:24:56 -0400 (0:00:00.235) 0:18:02.870 ***********
ok: [managed-node11] => {"changed": false, "cmd": ["vgs", "--noheadings", "--binary", "-o", "shared", "foo"], "delta": "0:00:00.025360", "end": "2025-06-16 17:24:57.670955", "rc": 0, "start": "2025-06-16 17:24:57.645595"}
STDOUT:
0

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Monday 16 June 2025 17:24:58 -0400 (0:00:01.796) 0:18:04.667 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed
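The shared-VG probe above boils down to one lvm call and one assertion. A minimal sketch of the same check (register name is illustrative):

    - name: Ask lvm whether VG "foo" is marked shared ("1") or not ("0")
      ansible.builtin.command: vgs --noheadings --binary -o shared foo
      register: vg_shared
      changed_when: false

    - name: This pool is not a shared VG, so expect "0"
      ansible.builtin.assert:
        that: vg_shared.stdout | trim == '0'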
TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Monday 16 June 2025 17:24:58 -0400 (0:00:00.423) 0:18:05.090 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11

TASK [Set test variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Monday 16 June 2025 17:24:59 -0400 (0:00:00.584) 0:18:05.675 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": ["/dev/sda"]}, "changed": false}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Monday 16 June 2025 17:24:59 -0400 (0:00:00.454) 0:18:06.129 ***********
ok: [managed-node11] => (item=/dev/sda) => {"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda"}

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Monday 16 June 2025 17:25:01 -0400 (0:00:01.850) 0:18:07.979 ***********
ok: [managed-node11] => {"ansible_facts": {"__pvs_lvm_len": "1"}, "changed": false}

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Monday 16 June 2025 17:25:01 -0400 (0:00:00.359) 0:18:08.338 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_pool_pvs": ["/dev/sda"]}, "changed": false}

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Monday 16 June 2025 17:25:01 -0400 (0:00:00.280) 0:18:08.619 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Monday 16 June 2025 17:25:02 -0400 (0:00:00.357) 0:18:08.977 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_expected_pv_type": "disk"}, "changed": false}

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Monday 16 June 2025 17:25:02 -0400 (0:00:00.281) 0:18:09.258 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_expected_pv_type": "disk"}, "changed": false}

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Monday 16 June 2025 17:25:03 -0400 (0:00:00.378) 0:18:09.637 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:54
Monday 16 June 2025 17:25:03 -0400 (0:00:00.220) 0:18:09.857 ***********
ok: [managed-node11] => (item=/dev/sda) => {"ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda"}
MSG: All assertions passed

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:67
Monday 16 June 2025 17:25:03 -0400 (0:00:00.377) 0:18:10.234 ***********
ok: [managed-node11] => {"changed": false, "failed_when_result": false, "rc": 1}
STDERR:
Shared connection to 10.31.14.162 closed.
MSG: non-zero return code
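Note the pattern in the grow-to-fill probe above: the command exits non-zero (rc 1) yet the task reports ok, because the test only wants to record whether the feature exists. A sketch of that probe style (the probed command here is a placeholder, not the test's actual script):

    - name: Probe for an optional feature without failing the play
      ansible.builtin.command: python3 -c "import blivet"   # placeholder probe
      register: probe
      failed_when: false        # rc carries the answer; never fail the task
      changed_when: false

    - name: Record the result for later conditionals
      ansible.builtin.set_fact:
        grow_to_fill_supported: "{{ probe.rc == 0 }}"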
TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:77
Monday 16 June 2025 17:25:05 -0400 (0:00:01.645) 0:18:11.880 ***********
skipping: [managed-node11] => (item=/dev/sda) => {"ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda"}

TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87
Monday 16 June 2025 17:25:05 -0400 (0:00:00.382) 0:18:12.263 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Monday 16 June 2025 17:25:06 -0400 (0:00:00.689) 0:18:12.953 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Monday 16 June 2025 17:25:06 -0400 (0:00:00.287) 0:18:13.240 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Monday 16 June 2025 17:25:06 -0400 (0:00:00.272) 0:18:13.513 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Monday 16 June 2025 17:25:07 -0400 (0:00:00.282) 0:18:13.795 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Monday 16 June 2025 17:25:07 -0400 (0:00:00.327) 0:18:14.123 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Monday 16 June 2025 17:25:07 -0400 (0:00:00.279) 0:18:14.403 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46
Monday 16 June 2025 17:25:08 -0400 (0:00:00.260) 0:18:14.664 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55
Monday 16 June 2025 17:25:08 -0400 (0:00:00.299) 0:18:14.964 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64
Monday 16 June 2025 17:25:08 -0400 (0:00:00.253) 0:18:15.217 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74
Monday 16 June 2025 17:25:08 -0400 (0:00:00.302) 0:18:15.520 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83
Monday 16 June 2025 17:25:09 -0400 (0:00:00.310) 0:18:15.830 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null}, "changed": false}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90
Monday 16 June 2025 17:25:09 -0400 (0:00:00.241) 0:18:16.072 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2
Monday 16 June 2025 17:25:10 -0400 (0:00:00.909) 0:18:16.982 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node11

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8
Monday 16 June 2025 17:25:11 -0400 (0:00:00.672) 0:18:17.654 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16
Monday 16 June 2025 17:25:11 -0400 (0:00:00.299) 0:18:17.954 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check segment type] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20
Monday 16 June 2025 17:25:11 -0400 (0:00:00.277) 0:18:18.232 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set LV stripe size] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27
Monday 16 June 2025 17:25:11 -0400 (0:00:00.274) 0:18:18.506 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the requested stripe size] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31
Monday 16 June 2025 17:25:12 -0400 (0:00:00.259) 0:18:18.765 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set expected stripe size] ************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37
Monday 16 June 2025 17:25:12 -0400 (0:00:00.354) 0:18:19.120 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check stripe size] *******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42
Monday 16 June 2025 17:25:12 -0400 (0:00:00.308) 0:18:19.428 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93
Monday 16 June 2025 17:25:13 -0400 (0:00:00.360) 0:18:19.789 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Monday 16 June 2025 17:25:13 -0400 (0:00:00.483) 0:18:20.272 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node11

TASK [Get information about thinpool] ******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8
Monday 16 June 2025 17:25:14 -0400 (0:00:00.514) 0:18:20.786 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16
Monday 16 June 2025 17:25:14 -0400 (0:00:00.230) 0:18:21.017 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22
Monday 16 June 2025 17:25:14 -0400 (0:00:00.336) 0:18:21.354 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26
Monday 16 June 2025 17:25:14 -0400 (0:00:00.265) 0:18:21.619 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_thin_status": null}, "changed": false}

TASK [Check member encryption] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96
Monday 16 June 2025 17:25:15 -0400 (0:00:00.250) 0:18:21.870 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11

TASK [Set test variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Monday 16 June 2025 17:25:15 -0400 (0:00:00.456) 0:18:22.327 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Monday 16 June 2025 17:25:16 -0400 (0:00:00.338) 0:18:22.665 ***********
skipping: [managed-node11] => (item=/dev/sda) => {"_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False"}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Monday 16 June 2025 17:25:16 -0400 (0:00:00.299) 0:18:22.965 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node11

TASK [Set variables used by tests] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2
Monday 16 June 2025 17:25:16 -0400 (0:00:00.330) 0:18:23.296 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_crypttab_entries": []}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6
Monday 16 June 2025 17:25:16 -0400 (0:00:00.283) 0:18:23.579 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed
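The member-level crypttab check above expects zero entries: the pool member (/dev/sda) is a plain PV, and only the LV on top of it is encrypted. A sketch of that counting check (the crypttab register is assumed from the earlier cat task):

    - name: Collect crypttab lines that reference a pool member
      ansible.builtin.set_fact:
        member_crypttab_entries: "{{ crypttab.stdout_lines | select('search', '/dev/sda') | list }}"

    - name: Members are not encrypted themselves, so expect no entries
      ansible.builtin.assert:
        that: member_crypttab_entries | length == 0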
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Monday 16 June 2025 17:25:17 -0400 (0:00:00.239) 0:18:23.818 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Monday 16 June 2025 17:25:17 -0400 (0:00:00.258) 0:18:24.077 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Monday 16 June 2025 17:25:17 -0400 (0:00:00.227) 0:18:24.305 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Monday 16 June 2025 17:25:17 -0400 (0:00:00.174) 0:18:24.479 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_crypttab_entries": null}, "changed": false}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Monday 16 June 2025 17:25:18 -0400 (0:00:00.343) 0:18:24.822 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99
Monday 16 June 2025 17:25:18 -0400 (0:00:00.205) 0:18:25.027 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Monday 16 June 2025 17:25:18 -0400 (0:00:00.575) 0:18:25.602 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node11

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Monday 16 June 2025 17:25:19 -0400 (0:00:00.471) 0:18:26.074 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Monday 16 June 2025 17:25:19 -0400 (0:00:00.329) 0:18:26.404 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Monday 16 June 2025 17:25:20 -0400 (0:00:00.334) 0:18:26.739 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Monday 16 June 2025 17:25:20 -0400 (0:00:00.355) 0:18:27.094 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Monday 16 June 2025 17:25:20 -0400 (0:00:00.278) 0:18:27.373 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Monday 16 June 2025 17:25:20 -0400 (0:00:00.230) 0:18:27.603 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Monday 16 June 2025 17:25:21 -0400 (0:00:00.385) 0:18:27.988 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102
Monday 16 June 2025 17:25:21 -0400 (0:00:00.201) 0:18:28.190 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Monday 16 June 2025 17:25:22 -0400 (0:00:00.526) 0:18:28.717 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Print script output] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Monday 16 June 2025 17:25:22 -0400 (0:00:00.375) 0:18:29.092 ***********
skipping: [managed-node11] => {}

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Monday 16 June 2025 17:25:22 -0400 (0:00:00.371) 0:18:29.463 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Monday 16 June 2025 17:25:23 -0400 (0:00:00.297) 0:18:29.761 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Monday 16 June 2025 17:25:24 -0400 (0:00:01.034) 0:18:30.795 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Monday 16 June 2025 17:25:24 -0400 (0:00:00.260) 0:18:31.056 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Monday 16 June 2025 17:25:24 -0400 (0:00:00.256) 0:18:31.312 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_stratis_report": null}, "changed": false}

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:105
Monday 16 June 2025 17:25:25 -0400 (0:00:00.362) 0:18:31.674 ***********
ok: [managed-node11] => {"ansible_facts": {"__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Monday 16 June 2025 17:25:25 -0400 (0:00:00.325) 0:18:31.999 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Monday 16 June 2025 17:25:25 -0400 (0:00:00.529) 0:18:32.528 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}
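Each entry in _storage_volume_tests above selects one verifier file. The dispatch, reconstructed from the task name and the filenames included below (the loop variable name is inferred, not confirmed by the log), looks roughly like:

    - name: Run test verify for storage_test_volume_subset
      ansible.builtin.include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset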
TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Monday 16 June 2025 17:25:26 -0400 (0:00:00.196) 0:18:32.725 ***********
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 16 June 2025 17:25:27 -0400 (0:00:01.062) 0:18:33.788 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 16 June 2025 17:25:27 -0400 (0:00:00.144) 0:18:33.932 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 16 June 2025 17:25:27 -0400 (0:00:00.313) 0:18:34.246 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 16 June 2025 17:25:28 -0400 (0:00:00.461) 0:18:34.707 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed
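The mount-state assertion above can be reproduced by hand with findmnt (register name is illustrative):

    - name: Resolve what is mounted at /opt/test1
      ansible.builtin.command: findmnt -no SOURCE /opt/test1
      register: mnt_src
      changed_when: false

    - name: The source should be the LUKS mapping, not the raw LV
      ansible.builtin.assert:
        that: mnt_src.stdout == '/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0'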
TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Monday 16 June 2025 17:25:28 -0400 (0:00:00.205) 0:18:34.912 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Monday 16 June 2025 17:25:28 -0400 (0:00:00.242) 0:18:35.154 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Monday 16 June 2025 17:25:28 -0400 (0:00:00.419) 0:18:35.574 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Monday 16 June 2025 17:25:29 -0400 (0:00:00.278) 0:18:35.852 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Monday 16 June 2025 17:25:29 -0400 (0:00:00.280) 0:18:36.133 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 16 June 2025 17:25:29 -0400 (0:00:00.293) 0:18:36.426 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 16 June 2025 17:25:30 -0400 (0:00:00.294) 0:18:36.721 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 16 June 2025 17:25:30 -0400 (0:00:00.361) 0:18:37.082 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 "], "storage_test_fstab_mount_options_matches": [" /opt/test1 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 16 June 2025 17:25:31 -0400 (0:00:00.637) 0:18:37.720 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 16 June 2025 17:25:31 -0400 (0:00:00.276) 0:18:37.997 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 16 June 2025 17:25:31 -0400 (0:00:00.276) 0:18:38.273 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 16 June 2025 17:25:31 -0400 (0:00:00.301) 0:18:38.575 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed
TASK [Clean up variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 16 June 2025 17:25:32 -0400 (0:00:00.270) 0:18:38.845 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 16 June 2025 17:25:32 -0400 (0:00:00.295) 0:18:39.140 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 16 June 2025 17:25:32 -0400 (0:00:00.399) 0:18:39.540 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 16 June 2025 17:25:33 -0400 (0:00:00.519) 0:18:40.059 ***********
ok: [managed-node11] => {
    "changed": false,
    "stat": {
        "atime": 1750109013.971991, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0,
        "charset": "binary", "ctime": 1750108941.6844122, "dev": 6, "device_type": 64768, "executable": false,
        "exists": true, "gid": 6, "gr_name": "disk", "inode": 230514, "isblk": true, "ischr": false,
        "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false,
        "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750108941.6844122, "nlink": 1,
        "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false,
        "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true,
        "wusr": true, "xgrp": false, "xoth": false, "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 16 June 2025 17:25:35 -0400 (0:00:01.610) 0:18:41.669 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 16 June 2025 17:25:35 -0400 (0:00:00.319) 0:18:41.988 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 16 June 2025 17:25:35 -0400 (0:00:00.331) 0:18:42.320 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed
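The device-node check above combines a stat with two assertions. A compact equivalent sketch:

    - name: Stat the raw LV node, as the task above does
      ansible.builtin.stat:
        path: /dev/mapper/foo-test1
      register: dev_stat

    - name: A present LVM volume must exist and be a block device
      ansible.builtin.assert:
        that:
          - dev_stat.stat.exists
          - dev_stat.stat.isblk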
TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 16 June 2025 17:25:36 -0400 (0:00:00.324) 0:18:42.644 ***********
ok: [managed-node11] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 16 June 2025 17:25:36 -0400 (0:00:00.316) 0:18:42.961 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 16 June 2025 17:25:36 -0400 (0:00:00.362) 0:18:43.323 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 16 June 2025 17:25:36 -0400 (0:00:00.291) 0:18:43.614 ***********
ok: [managed-node11] => {
    "changed": false,
    "stat": {
        "atime": 1750109071.9324555, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0,
        "charset": "binary", "ctime": 1750108941.8294134, "dev": 6, "device_type": 64769, "executable": false,
        "exists": true, "gid": 6, "gr_name": "disk", "inode": 229862, "isblk": true, "ischr": false,
        "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false,
        "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750108941.8294134, "nlink": 1,
        "path": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "pw_name": "root", "readable": true,
        "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true,
        "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false
    }
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 16 June 2025 17:25:38 -0400 (0:00:01.988) 0:18:45.602 ***********
ok: [managed-node11] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do
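"Nothing to do" above means the package was already installed; the task is the usual idempotent guard before calling the cryptsetup CLI:

    - name: Ensure cryptsetup is installed before using the CLI
      ansible.builtin.package:
        name: cryptsetup
        state: present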
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 16 June 2025 17:25:43 -0400 (0:00:04.793) 0:18:50.396 ***********
ok: [managed-node11] => {"changed": false, "cmd": ["cryptsetup", "luksDump", "/dev/mapper/foo-test1"], "delta": "0:00:00.010299", "end": "2025-06-16 17:25:45.190552", "rc": 0, "start": "2025-06-16 17:25:45.180253"}
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     923141
        Threads:    2
        Salt:       e8 8f 0a de c6 6c 22 50 44 1a 8c 0a c3 c1 81 25
                    40 10 a6 a0 81 20 e0 f7 49 55 5e 29 b9 0d 25 00
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120470
        Salt:       76 fa e1 90 60 ed fc 3a 0f 96 77 81 96 f9 4d 3e
                    d5 21 95 f4 b6 be d5 e6 73 c1 9a 8f 32 4c bb db
        Digest:     d6 54 14 36 6e de 3b 3e b3 a9 e8 29 c3 9b d6 3f
                    90 86 f1 1f 35 d1 d1 31 14 40 64 4a 8f a5 3c aa

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 16 June 2025 17:25:45 -0400 (0:00:01.686) 0:18:52.082 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 16 June 2025 17:25:45 -0400 (0:00:00.387) 0:18:52.470 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 16 June 2025 17:25:46 -0400 (0:00:00.393) 0:18:52.863 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 16 June 2025 17:25:46 -0400 (0:00:00.362) 0:18:53.225 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 16 June 2025 17:25:46 -0400 (0:00:00.290) 0:18:53.516 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 16 June 2025 17:25:47 -0400 (0:00:00.452) 0:18:53.968 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 16 June 2025 17:25:47 -0400 (0:00:00.264) 0:18:54.233 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}
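The version, key-size, and cipher checks above all read fields from the header dump. A minimal sketch of dumping and asserting on the Version field (register name illustrative):

    - name: Dump the LUKS header of the backing LV
      ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: The volume requested luks2, so the header must say Version 2
      ansible.builtin.assert:
        that: 'luks_dump.stdout is search("Version:\s+2")'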
TASK [Set test variables] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 16 June 2025 17:25:47 -0400 (0:00:00.312) 0:18:54.545 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_crypttab_entries": ["luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 -"], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 16 June 2025 17:25:48 -0400 (0:00:00.323) 0:18:54.869 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 16 June 2025 17:25:48 -0400 (0:00:00.380) 0:18:55.250 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 16 June 2025 17:25:48 -0400 (0:00:00.278) 0:18:55.528 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 16 June 2025 17:25:49 -0400 (0:00:00.309) 0:18:55.837 ***********
ok: [managed-node11] => {"changed": false}
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 16 June 2025 17:25:49 -0400 (0:00:00.298) 0:18:56.136 ***********
ok: [managed-node11] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 16 June 2025 17:25:49 -0400 (0:00:00.319) 0:18:56.455 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 16 June 2025 17:25:50 -0400 (0:00:00.276) 0:18:56.731 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 16 June 2025 17:25:50 -0400 (0:00:00.277) 0:18:57.009 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 16 June 2025 17:25:50 -0400 (0:00:00.299) 0:18:57.309 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 16 June 2025 17:25:50 -0400 (0:00:00.279) 0:18:57.589 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 16 June 2025 17:25:51 -0400 (0:00:00.292) 0:18:57.881 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 16 June 2025 17:25:51 -0400 (0:00:00.342) 0:18:58.224 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 16 June 2025 17:25:51 -0400 (0:00:00.320) 0:18:58.544 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 16 June 2025 17:25:52 -0400 (0:00:00.239) 0:18:58.784 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 16 June 2025 17:25:52 -0400 (0:00:00.377) 0:18:59.161 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 16 June 2025 17:25:52 -0400 (0:00:00.374) 0:18:59.536 ***********
ok: [managed-node11] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 16 June 2025 17:25:54 -0400 (0:00:01.619) 0:19:01.155 ***********
ok: [managed-node11] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 16 June 2025 17:25:56 -0400 (0:00:01.488) 0:19:02.644 ***********
ok: [managed-node11] => {"ansible_facts": {"storage_test_expected_size": "4294967296"}, "changed": false}
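The size comparison above turns the requested "4g" into bytes and matches it against the parsed LV size. The same conversion is available as a filter (human_to_bytes ships with ansible-core; treat the exact task layout as a sketch):

    - name: Convert the requested size to bytes
      ansible.builtin.set_fact:
        expected_bytes: "{{ '4g' | human_to_bytes }}"   # 4 GiB -> 4294967296

    - name: Expected size equals the actual size parsed above
      ansible.builtin.assert:
        that: expected_bytes | int == 4294967296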
TASK [Show expected size] ******************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 16 June 2025 17:25:56 -0400 (0:00:00.331) 0:19:02.975 ***********
ok: [managed-node11] => {"storage_test_expected_size": "4294967296"}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 16 June 2025 17:25:56 -0400 (0:00:00.180) 0:19:03.156 ***********
ok: [managed-node11] => {"bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB"}

TASK [Show test pool] **********************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 16 June 2025 17:25:58 -0400 (0:00:01.736) 0:19:04.893 ***********
skipping: [managed-node11] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 16 June 2025 17:25:58 -0400 (0:00:00.281) 0:19:05.175 ***********
skipping: [managed-node11] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 16 June 2025 17:25:58 -0400 (0:00:00.302) 0:19:05.478 ***********
skipping: [managed-node11] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 16 June 2025 17:25:59 -0400 (0:00:00.281) 0:19:05.760 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 16 June 2025 17:25:59 -0400 (0:00:00.445) 0:19:06.205 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 16 June 2025 17:25:59 -0400 (0:00:00.291) 0:19:06.496 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 16 June 2025 17:26:00 -0400 (0:00:00.266) 0:19:06.763 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 16 June 2025 17:26:00 -0400 (0:00:00.356) 0:19:07.119 ***********
skipping: [managed-node11] => {"changed": false, "skip_reason": "Conditional result was False"}
(0:00:00.287) 0:19:07.407 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:26:01 -0400 (0:00:00.294) 0:19:07.702 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:26:01 -0400 (0:00:00.372) 0:19:08.074 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:26:01 -0400 (0:00:00.235) 0:19:08.309 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:26:01 -0400 (0:00:00.259) 0:19:08.569 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:26:02 -0400 (0:00:00.293) 0:19:08.862 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:26:02 -0400 (0:00:00.226) 0:19:09.089 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:26:02 -0400 (0:00:00.271) 0:19:09.360 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:26:03 -0400 (0:00:00.307) 0:19:09.668 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:26:03 -0400 (0:00:00.254) 0:19:09.922 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 
17:26:04 -0400 (0:00:00.895) 0:19:10.818 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:26:04 -0400 (0:00:00.305) 0:19:11.123 *********** ok: [managed-node11] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:26:04 -0400 (0:00:00.314) 0:19:11.437 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:26:05 -0400 (0:00:00.294) 0:19:11.732 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:26:05 -0400 (0:00:00.410) 0:19:12.142 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023311", "end": "2025-06-16 17:26:06.904579", "rc": 0, "start": "2025-06-16 17:26:06.881268" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:26:07 -0400 (0:00:01.713) 0:19:13.856 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:26:07 -0400 (0:00:00.254) 0:19:14.111 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:26:07 -0400 (0:00:00.293) 0:19:14.404 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:26:08 -0400 (0:00:00.417) 0:19:14.822 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] 
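The cache verification above shells out to lvs; LVM2_SEGTYPE=linear in its output confirms the LV is not cached, so the cache-size tasks that follow all skip. The same query as an ad-hoc task, with the argv copied from the log:

    - name: Get information about the LV
      command:
        argv:
          - lvs
          - --noheadings
          - --nameprefixes
          - --units=b
          - --nosuffix
          - --unquoted
          - -o
          - name,attr,cache_total_blocks,chunk_size,segtype
          - foo/test1
      register: lvs_info
      changed_when: false   # read-only query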
************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 16 June 2025 17:26:08 -0400 (0:00:00.372) 0:19:15.194 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 16 June 2025 17:26:08 -0400 (0:00:00.235) 0:19:15.429 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 16 June 2025 17:26:09 -0400 (0:00:00.293) 0:19:15.723 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:26:09 -0400 (0:00:00.217) 0:19:15.941 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 16 June 2025 17:26:09 -0400 (0:00:00.242) 0:19:16.183 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 16 June 2025 17:26:09 -0400 (0:00:00.314) 0:19:16.498 *********** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:436 Monday 16 June 2025 17:26:11 -0400 (0:00:01.741) 0:19:18.239 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 16 June 2025 17:26:12 -0400 (0:00:00.484) 0:19:18.724 *********** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 16 June 2025 17:26:12 -0400 (0:00:00.300) 0:19:19.024 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:26:12 -0400 (0:00:00.362) 0:19:19.387 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:26:13 -0400 (0:00:00.334) 0:19:19.721 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:26:13 -0400 (0:00:00.351) 0:19:20.073 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:26:14 -0400 (0:00:00.576) 0:19:20.649 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:26:14 -0400 (0:00:00.249) 0:19:20.898 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:26:14 -0400 (0:00:00.260) 0:19:21.159 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage 
: Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:26:14 -0400 (0:00:00.217) 0:19:21.377 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:26:14 -0400 (0:00:00.226) 0:19:21.604 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:26:15 -0400 (0:00:00.563) 0:19:22.167 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:26:19 -0400 (0:00:04.315) 0:19:26.483 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:26:20 -0400 (0:00:00.282) 0:19:26.765 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:26:20 -0400 (0:00:00.195) 0:19:26.961 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:26:26 -0400 (0:00:05.681) 0:19:32.642 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:26:26 -0400 (0:00:00.479) 0:19:33.121 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:26:26 -0400 (0:00:00.217) 0:19:33.339 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] 
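The "Show storage_pools" output above is the spec this failing run was invoked with: encryption: false while the device is still LUKS-formatted, i.e. a request to strip the encryption layer, with storage_safe_mode left at true (stored as storage_safe_mode_global earlier). A sketch of the invoking play, reconstructed from that output (the actual include lives in verify-role-failed.yml):

    - name: Run the role, expecting a safe-mode failure
      vars:
        storage_safe_mode: true
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: false              # asks the role to remove LUKS
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo
      include_role:
        name: fedora.linux_system_roles.storage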
************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:26:27 -0400 (0:00:00.882) 0:19:34.221 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:26:27 -0400 (0:00:00.204) 0:19:34.425 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:26:32 -0400 (0:00:04.855) 0:19:39.281 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service": { "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service": { "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:26:35 -0400 (0:00:03.000) 0:19:42.281 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:26:36 -0400 (0:00:00.355) 0:19:42.637 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d81fa0ad9\x2d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": 
"dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": 
"infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:24:26 EDT", "StateChangeTimestampMonotonic": "2702198719", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => 
(item=systemd-cryptsetup@luk...d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", 
"LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:26:39 -0400 (0:00:03.255) 0:19:45.892 *********** fatal: [managed-node11]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:26:44 -0400 (0:00:05.714) 0:19:51.606 *********** fatal: [managed-node11]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 
'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:26:45 -0400 (0:00:00.407) 0:19:52.014 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d81fa0ad9\x2d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:24:26 EDT", "StateChangeTimestampMonotonic": "2702198719", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", 
"TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:26:49 -0400 (0:00:03.887) 0:19:55.901 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:26:49 -0400 (0:00:00.274) 0:19:56.176 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Monday 16 June 2025 17:26:49 -0400 (0:00:00.240) 0:19:56.417 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 16 June 2025 17:26:49 -0400 (0:00:00.175) 0:19:56.592 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109171.1862519, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750109171.1862519, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1750109171.1862519, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "4071147487", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 16 June 2025 17:26:51 -0400 (0:00:01.611) 0:19:58.203 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:460 Monday 16 June 2025 17:26:51 -0400 (0:00:00.171) 0:19:58.375 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:26:52 -0400 (0:00:00.621) 0:19:58.997 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:26:52 -0400 (0:00:00.358) 0:19:59.355 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version 
specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:26:53 -0400 (0:00:00.314) 0:19:59.670 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:26:53 -0400 (0:00:00.721) 0:20:00.391 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:26:54 -0400 (0:00:00.264) 0:20:00.656 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:26:54 -0400 (0:00:00.169) 0:20:00.826 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:26:54 -0400 (0:00:00.216) 0:20:01.043 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:26:54 -0400 (0:00:00.187) 0:20:01.230 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for 
managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:26:55 -0400 (0:00:00.497) 0:20:01.728 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:26:59 -0400 (0:00:04.827) 0:20:06.556 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:27:00 -0400 (0:00:00.310) 0:20:06.867 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:27:00 -0400 (0:00:00.271) 0:20:07.139 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:27:05 -0400 (0:00:05.370) 0:20:12.509 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:27:06 -0400 (0:00:00.437) 0:20:12.947 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:27:06 -0400 (0:00:00.313) 0:20:13.260 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:27:06 -0400 (0:00:00.364) 0:20:13.624 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:27:07 -0400 (0:00:00.135) 0:20:13.759 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:27:11 -0400 (0:00:04.826) 0:20:18.586 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { 
"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": 
"nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service": { "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service": { "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:27:14 -0400 (0:00:02.779) 0:20:21.366 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:27:15 -0400 (0:00:00.305) 0:20:21.671 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d81fa0ad9\x2d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:24:26 EDT", "StateChangeTimestampMonotonic": "2702198719", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", 
"CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": 
"yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:27:17 -0400 (0:00:02.822) 0:20:24.494 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:27:23 -0400 (0:00:06.117) 0:20:30.612 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:27:24 -0400 (0:00:00.219) 0:20:30.832 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108951.4554906, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "af13c3a106572cc405a934ce35bd466146b20c74", "ctime": 1750108951.4524906, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750108951.4524906, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:27:25 -0400 (0:00:01.633) 0:20:32.465 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:27:27 -0400 (0:00:01.311) 0:20:33.777 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d81fa0ad9\x2d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": 
"none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:24:26 EDT", "StateChangeTimestampMonotonic": "2702198719", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", 
"TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:27:30 -0400 (0:00:03.424) 0:20:37.201 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:27:30 -0400 (0:00:00.323) 0:20:37.524 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
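_storage_pools_list is the role's post-run contract: verify-role-results.yml, included below, replays it against blkid, fstab, and crypttab reality. A consumer outside the shipped test suite can assert against it the same way; a minimal sketch matching this run's expected end state:

    - name: Assert test1 ended up unencrypted on the expected device (sketch)
      assert:
        that:
          - not _storage_pools_list[0].volumes[0].encryption
          - _storage_pools_list[0].volumes[0]._device == '/dev/mapper/foo-test1'
          - _storage_pools_list[0].volumes[0].mount_point == '/opt/test1'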
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:27:31 -0400 (0:00:00.294) 0:20:37.819 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:27:31 -0400 (0:00:00.176) 0:20:37.996 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:27:33 -0400 (0:00:01.717) 0:20:39.713 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:27:34 -0400 (0:00:01.824) 0:20:41.538 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": 
true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:27:36 -0400 (0:00:01.503) 0:20:43.042 *********** skipping: [managed-node11] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:27:36 -0400 (0:00:00.255) 0:20:43.297 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:27:38 -0400 (0:00:01.884) 0:20:45.182 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750108966.248609, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "692f28302299cbc7919c026232b557c294a53d1d", "ctime": 1750108958.1075437, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 471859331, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1750108958.1065438, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2312515694", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:27:39 -0400 (0:00:01.392) 0:20:46.575 *********** changed: [managed-node11] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 
Monday 16 June 2025 17:27:41 -0400 (0:00:01.348) 0:20:47.923 *********** ok: [managed-node11] TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:477 Monday 16 June 2025 17:27:43 -0400 (0:00:02.231) 0:20:50.154 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:27:44 -0400 (0:00:00.511) 0:20:50.666 *********** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:27:44 -0400 (0:00:00.289) 0:20:50.956 *********** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:27:44 -0400 (0:00:00.258) 0:20:51.215 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "d1c89885-1c05-42a6-8612-c834d4c9f8db" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "al0uCf-R2qd-eaR5-lcbe-8d4g-1JeN-2yYBO1" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:27:46 -0400 (0:00:01.486) 0:20:52.701 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003255", "end": "2025-06-16 17:27:47.593878", "rc": 0, "start": "2025-06-16 17:27:47.590623" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 16 June 2025 17:27:47 -0400 (0:00:01.860) 0:20:54.562 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002849", "end": "2025-06-16 17:27:49.192812", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:27:49.189963" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 16 June 2025 17:27:49 -0400 (0:00:01.553) 0:20:56.116 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 16 June 2025 17:27:50 -0400 (0:00:00.601) 0:20:56.718 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 16 June 2025 17:27:50 -0400 (0:00:00.250) 0:20:56.968 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.025038", "end": "2025-06-16 17:27:51.676003", "rc": 0, "start": "2025-06-16 17:27:51.650965" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 16 June 2025 17:27:52 -0400 (0:00:01.720) 0:20:58.689 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 16 June 2025 17:27:52 -0400 (0:00:00.443) 0:20:59.132 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 
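The shared-VG probe above is a plain vgs query; --binary forces the shared column to print 0 or 1 so the follow-up assertion can compare a single character. Reproduced as a standalone task (the command line is taken verbatim from this run; the surrounding task fields are illustrative):

    - name: Query whether VG foo is a shared (lvmlockd) volume group (sketch)
      command: vgs --noheadings --binary -o shared foo
      register: vg_shared
      changed_when: false
      failed_when: (vg_shared.stdout | trim) not in ['0', '1']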
included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 16 June 2025 17:27:53 -0400 (0:00:00.597) 0:20:59.730 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 16 June 2025 17:27:53 -0400 (0:00:00.410) 0:21:00.140 *********** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 16 June 2025 17:27:55 -0400 (0:00:01.657) 0:21:01.798 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 16 June 2025 17:27:55 -0400 (0:00:00.381) 0:21:02.179 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 16 June 2025 17:27:55 -0400 (0:00:00.198) 0:21:02.377 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 16 June 2025 17:27:56 -0400 (0:00:00.282) 0:21:02.659 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 16 June 2025 17:27:56 -0400 (0:00:00.335) 0:21:02.995 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 16 June 2025 17:27:56 -0400 (0:00:00.203) 0:21:03.198 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:54 Monday 16 June 2025 17:27:56 -0400 (0:00:00.291) 0:21:03.489 *********** ok: [managed-node11] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:67 Monday 16 June 2025 17:27:57 -0400 (0:00:00.360) 0:21:03.850 *********** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.162 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:77 Monday 16 June 2025 17:27:58 -0400 (0:00:01.556) 0:21:05.406 *********** skipping: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Monday 16 June 2025 17:27:59 -0400 (0:00:00.265) 0:21:05.671 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 16 June 2025 17:27:59 -0400 (0:00:00.608) 0:21:06.280 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 16 June 2025 17:27:59 -0400 (0:00:00.337) 0:21:06.617 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 16 June 2025 17:28:00 -0400 (0:00:00.225) 0:21:06.842 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 16 June 2025 17:28:00 -0400 (0:00:00.378) 0:21:07.220 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 16 June 2025 17:28:00 -0400 (0:00:00.241) 0:21:07.462 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 16 June 2025 17:28:01 -0400 (0:00:00.340) 0:21:07.803 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 16 June 2025 17:28:01 -0400 (0:00:00.238) 0:21:08.041 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 16 June 2025 17:28:02 -0400 (0:00:00.917) 0:21:08.958 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 16 June 2025 17:28:02 -0400 (0:00:00.269) 0:21:09.228 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 16 June 2025 17:28:02 -0400 (0:00:00.241) 0:21:09.469 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 16 June 2025 17:28:03 -0400 (0:00:00.231) 0:21:09.701 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Monday 16 June 2025 17:28:03 -0400 (0:00:00.202) 0:21:09.903 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 16 June 2025 17:28:03 -0400 (0:00:00.507) 0:21:10.411 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node11 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 16 June 2025 17:28:04 -0400 (0:00:00.576) 0:21:10.988 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 16 June 2025 17:28:04 -0400 (0:00:00.320) 0:21:11.308 *********** skipping: [managed-node11] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 16 June 2025 17:28:04 -0400 (0:00:00.244) 0:21:11.552 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 16 June 2025 17:28:05 -0400 (0:00:00.177) 0:21:11.729 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 16 June 2025 17:28:05 -0400 (0:00:00.196) 0:21:11.926 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 16 June 2025 17:28:05 -0400 (0:00:00.313) 0:21:12.240 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 16 June 2025 17:28:05 -0400 (0:00:00.345) 0:21:12.586 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Monday 16 June 2025 17:28:06 -0400 (0:00:00.324) 0:21:12.910 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 16 June 2025 17:28:06 -0400 (0:00:00.544) 0:21:13.455 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node11 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 16 June 2025 17:28:07 -0400 (0:00:00.554) 0:21:14.009 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 16 June 2025 17:28:07 -0400 (0:00:00.393) 0:21:14.403 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in 
thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 16 June 2025 17:28:08 -0400 (0:00:00.284) 0:21:14.688 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 16 June 2025 17:28:08 -0400 (0:00:00.293) 0:21:14.982 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Monday 16 June 2025 17:28:08 -0400 (0:00:00.316) 0:21:15.298 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 16 June 2025 17:28:09 -0400 (0:00:00.600) 0:21:15.899 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 16 June 2025 17:28:09 -0400 (0:00:00.249) 0:21:16.149 *********** skipping: [managed-node11] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 16 June 2025 17:28:09 -0400 (0:00:00.262) 0:21:16.411 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node11 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 16 June 2025 17:28:10 -0400 (0:00:00.446) 0:21:16.858 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 16 June 2025 17:28:10 -0400 (0:00:00.426) 0:21:17.285 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed
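The member-encryption checks above compare the list of matching /etc/crypttab lines against an expected count; for these unencrypted pool members the expectation is "0" entries, with "-" as the expected key-file field. Per crypttab(5), each entry has the form

  <volume name> <backing device> <key file> <options>

for example (illustrative values only): luks-1234 /dev/sda1 none. A rough sketch of the count assertion, reusing the fact names that appear in this log:

  - name: Check for /etc/crypttab entry (sketch)
    assert:
      that:
        - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int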
[managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 16 June 2025 17:28:11 -0400 (0:00:00.312) 0:21:17.920 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 16 June 2025 17:28:11 -0400 (0:00:00.382) 0:21:18.302 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 16 June 2025 17:28:12 -0400 (0:00:00.331) 0:21:18.634 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 16 June 2025 17:28:12 -0400 (0:00:00.232) 0:21:18.866 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Monday 16 June 2025 17:28:12 -0400 (0:00:00.266) 0:21:19.133 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 16 June 2025 17:28:13 -0400 (0:00:00.558) 0:21:19.691 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node11 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 16 June 2025 17:28:13 -0400 (0:00:00.544) 0:21:20.235 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 16 June 2025 17:28:13 -0400 (0:00:00.255) 0:21:20.491 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 16 June 2025 17:28:14 -0400 (0:00:00.257) 0:21:20.748 *********** skipping: [managed-node11] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 16 June 2025 17:28:14 -0400 (0:00:00.319) 0:21:21.068 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 16 June 2025 17:28:14 -0400 (0:00:00.244) 0:21:21.312 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 16 June 2025 17:28:14 -0400 (0:00:00.262) 0:21:21.575 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 16 June 2025 17:28:15 -0400 (0:00:00.639) 0:21:22.215 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Monday 16 June 2025 17:28:15 -0400 (0:00:00.180) 0:21:22.395 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 16 June 2025 17:28:16 -0400 (0:00:00.549) 0:21:22.945 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 16 June 2025 17:28:16 -0400 (0:00:00.294) 0:21:23.240 *********** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 16 June 2025 17:28:16 -0400 (0:00:00.239) 0:21:23.479 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 16 June 2025 17:28:17 -0400 (0:00:00.275) 0:21:23.755 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 16 June 2025 17:28:17 -0400 (0:00:00.297) 0:21:24.052 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 16 June 2025 17:28:17 -0400 (0:00:00.219) 0:21:24.272 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 16 June 2025 17:28:18 -0400 (0:00:00.374) 0:21:24.646 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:105 Monday 16 June 2025 17:28:18 -0400 (0:00:00.213) 0:21:24.860 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 16 June 2025 17:28:18 -0400 (0:00:00.272) 0:21:25.133 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:28:19 -0400 (0:00:00.527) 0:21:25.660 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:28:19 -0400 (0:00:00.286) 0:21:25.946 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:28:20 -0400 (0:00:01.381) 0:21:27.328 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:28:20 -0400 (0:00:00.254) 0:21:27.583 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:28:21 -0400 (0:00:00.287) 0:21:27.870 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:28:21 -0400 (0:00:00.341) 0:21:28.212 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:28:21 -0400 (0:00:00.294) 0:21:28.507 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:28:22 -0400 (0:00:00.270) 0:21:28.777 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:28:22 -0400 (0:00:00.185) 0:21:28.962 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 16 June 2025 17:28:22 -0400 (0:00:00.266) 0:21:29.229 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 16 
June 2025 17:28:22 -0400 (0:00:00.188) 0:21:29.417 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 16 June 2025 17:28:23 -0400 (0:00:00.419) 0:21:29.837 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 16 June 2025 17:28:23 -0400 (0:00:00.256) 0:21:30.093 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 16 June 2025 17:28:23 -0400 (0:00:00.225) 0:21:30.319 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 16 June 2025 17:28:24 -0400 (0:00:00.544) 0:21:30.864 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 16 June 2025 17:28:24 -0400 (0:00:00.360) 0:21:31.224 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 16 June 2025 17:28:24 -0400 (0:00:00.363) 0:21:31.587 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 16 June 2025 17:28:25 -0400 (0:00:00.366) 0:21:31.954 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed
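The fstab checks above work by counting regex matches pulled from /etc/fstab. Putting the recorded matches back together, the entry under test reads like the following line (the trailing dump/pass fields are assumed here, since the regexes do not capture them):

  /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

A rough sketch of the match-count assertion, reusing the fact names from this log:

  - name: Verify that the device identifier appears in /etc/fstab (sketch)
    assert:
      that:
        - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int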
"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 16 June 2025 17:28:25 -0400 (0:00:00.225) 0:21:32.564 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 16 June 2025 17:28:26 -0400 (0:00:00.444) 0:21:33.008 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 16 June 2025 17:28:26 -0400 (0:00:00.374) 0:21:33.382 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109243.6588333, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750109243.6588333, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 259938, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750109243.6588333, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 16 June 2025 17:28:28 -0400 (0:00:01.837) 0:21:35.220 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 16 June 2025 17:28:28 -0400 (0:00:00.386) 0:21:35.607 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 16 June 2025 17:28:29 -0400 (0:00:00.295) 0:21:35.902 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 16 June 2025 17:28:29 -0400 (0:00:00.240) 0:21:36.143 *********** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK 
[Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 16 June 2025 17:28:29 -0400 (0:00:00.321) 0:21:36.465 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 16 June 2025 17:28:30 -0400 (0:00:00.284) 0:21:36.750 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 16 June 2025 17:28:30 -0400 (0:00:00.216) 0:21:36.967 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 16 June 2025 17:28:30 -0400 (0:00:00.286) 0:21:37.253 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 16 June 2025 17:28:35 -0400 (0:00:04.707) 0:21:41.960 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 16 June 2025 17:28:35 -0400 (0:00:00.206) 0:21:42.167 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 16 June 2025 17:28:35 -0400 (0:00:00.183) 0:21:42.351 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 16 June 2025 17:28:36 -0400 (0:00:00.306) 0:21:42.657 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 16 June 2025 17:28:36 -0400 (0:00:00.182) 0:21:42.840 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 16 June 2025 17:28:36 -0400 (0:00:00.125) 0:21:42.967 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 16 June 2025 17:28:36 -0400 (0:00:00.200) 0:21:43.167 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 16 June 2025 17:28:36 -0400 (0:00:00.200) 0:21:43.367 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 16 June 2025 17:28:36 -0400 (0:00:00.215) 0:21:43.583 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 16 June 2025 17:28:37 -0400 (0:00:00.381) 0:21:43.965 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 16 June 2025 17:28:38 -0400 (0:00:00.746) 0:21:44.711 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 16 June 2025 17:28:38 -0400 (0:00:00.278) 0:21:44.990 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 16 June 2025 17:28:38 -0400 (0:00:00.178) 0:21:45.168 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 16 June 2025 17:28:38 -0400 (0:00:00.127) 0:21:45.295 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about 
RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 16 June 2025 17:28:38 -0400 (0:00:00.279) 0:21:45.575 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 16 June 2025 17:28:39 -0400 (0:00:00.265) 0:21:45.840 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 16 June 2025 17:28:39 -0400 (0:00:00.199) 0:21:46.040 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 16 June 2025 17:28:39 -0400 (0:00:00.301) 0:21:46.342 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 16 June 2025 17:28:40 -0400 (0:00:00.288) 0:21:46.631 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:28:40 -0400 (0:00:00.290) 0:21:46.921 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:28:40 -0400 (0:00:00.263) 0:21:47.184 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:28:40 -0400 (0:00:00.220) 0:21:47.405 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:28:41 -0400 (0:00:00.261) 0:21:47.667 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 
17:28:41 -0400 (0:00:00.182) 0:21:47.850 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:28:41 -0400 (0:00:00.238) 0:21:48.088 *********** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:28:43 -0400 (0:00:01.709) 0:21:49.798 *********** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:28:44 -0400 (0:00:01.349) 0:21:51.147 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:28:44 -0400 (0:00:00.320) 0:21:51.467 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:28:45 -0400 (0:00:00.255) 0:21:51.723 *********** ok: [managed-node11] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:28:46 -0400 (0:00:01.421) 0:21:53.144 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:28:46 -0400 (0:00:00.261) 0:21:53.407 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:28:47 -0400 (0:00:00.332) 0:21:53.739 *********** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:28:47 -0400 (0:00:00.256) 0:21:53.996 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
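The two parse tasks above resolve the requested "4g" and the actual LV size to the same byte count: 4 GiB = 4 x 1024^3 = 4294967296 bytes, which is what the "Assert expected size is actual size" task further down compares. Ansible's human_to_bytes filter performs the same conversion; a minimal sketch (the filter is 1024-based, so "4G" means GiB here):

  - name: Convert a human-readable size to bytes
    debug:
      msg: "{{ '4G' | human_to_bytes }}"  # prints 4294967296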
TASK [Default thin pool reserved space values] ********************************* task path:
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:28:47 -0400 (0:00:00.259) 0:21:54.255 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:28:47 -0400 (0:00:00.259) 0:21:54.514 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:28:48 -0400 (0:00:00.207) 0:21:54.722 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 16 June 2025 17:28:48 -0400 (0:00:00.302) 0:21:55.025 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 16 June 2025 17:28:48 -0400 (0:00:00.218) 0:21:55.244 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:28:48 -0400 (0:00:00.214) 0:21:55.458 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:28:48 -0400 (0:00:00.161) 0:21:55.619 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:28:49 -0400 (0:00:00.290) 0:21:55.909 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:28:49 -0400 (0:00:00.294) 0:21:56.204 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:28:49 -0400 (0:00:00.295) 0:21:56.499 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task
path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:28:50 -0400 (0:00:00.192) 0:21:56.692 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:28:50 -0400 (0:00:00.300) 0:21:56.992 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:28:50 -0400 (0:00:00.258) 0:21:57.251 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:28:50 -0400 (0:00:00.164) 0:21:57.415 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 17:28:50 -0400 (0:00:00.144) 0:21:57.560 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:28:51 -0400 (0:00:00.113) 0:21:57.674 *********** ok: [managed-node11] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:28:51 -0400 (0:00:00.147) 0:21:57.821 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:28:51 -0400 (0:00:00.148) 0:21:57.970 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:28:51 -0400 (0:00:00.197) 0:21:58.167 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024342", "end": "2025-06-16 17:28:52.486024", "rc": 0, "start": "2025-06-16 17:28:52.461682" } STDOUT: 
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:28:52 -0400 (0:00:01.156) 0:21:59.323 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:28:52 -0400 (0:00:00.117) 0:21:59.441 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:28:52 -0400 (0:00:00.149) 0:21:59.591 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:28:53 -0400 (0:00:00.110) 0:21:59.701 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 16 June 2025 17:28:53 -0400 (0:00:00.140) 0:21:59.842 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 16 June 2025 17:28:53 -0400 (0:00:00.181) 0:22:00.024 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 16 June 2025 17:28:53 -0400 (0:00:00.166) 0:22:00.190 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:28:53 -0400 (0:00:00.087) 0:22:00.277 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 16 June 2025 17:28:53 -0400 (0:00:00.099) 0:22:00.377 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
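The run now reaches the safe-mode regression check: a scratch file is created on the mounted volume (/opt/test1/quux), then the role is re-run with storage_safe_mode still true and a pool spec that asks for test1 to become LUKS2-encrypted. Because the existing device is unencrypted and carries data, the role is expected to fail rather than reformat it. A sketch of the invocation being exercised, reconstructed from the storage_pools dump printed below (the real test wraps this in helper tasks that assert the failure and later verify the file survived):

  - name: Re-run the role with safe mode on
    include_role:
      name: fedora.linux_system_roles.storage
    vars:
      storage_safe_mode: true
      storage_pools:
        - name: foo
          disks: ["sda"]
          type: lvm
          volumes:
            - name: test1
              size: "4g"
              mount_point: "/opt/test1"
              encryption: true
              encryption_luks_version: luks2
              encryption_password: yabbadabbadoo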
TASK [Create a file] *********************************************************** task path:
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 16 June 2025 17:28:53 -0400 (0:00:00.123) 0:22:00.500 *********** changed: [managed-node11] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:483 Monday 16 June 2025 17:28:55 -0400 (0:00:01.428) 0:22:01.928 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node11 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 16 June 2025 17:28:55 -0400 (0:00:00.535) 0:22:02.464 *********** ok: [managed-node11] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 16 June 2025 17:28:56 -0400 (0:00:00.195) 0:22:02.660 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:28:56 -0400 (0:00:00.275) 0:22:02.936 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:28:57 -0400 (0:00:00.831) 0:22:03.767 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:28:57 -0400 (0:00:00.234) 0:22:04.001 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet",
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:28:57 -0400 (0:00:00.623) 0:22:04.625 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:28:58 -0400 (0:00:00.253) 0:22:04.878 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:28:58 -0400 (0:00:00.314) 0:22:05.192 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:28:58 -0400 (0:00:00.270) 0:22:05.463 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:28:59 -0400 (0:00:00.372) 0:22:05.836 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:28:59 -0400 (0:00:00.713) 0:22:06.549 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:29:04 -0400 (0:00:04.931) 0:22:11.481 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:29:05 -0400 (0:00:00.266) 0:22:11.748 *********** ok: [managed-node11] => { 
"storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:29:05 -0400 (0:00:00.225) 0:22:11.973 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:29:10 -0400 (0:00:05.347) 0:22:17.321 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:29:11 -0400 (0:00:00.482) 0:22:17.803 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:29:11 -0400 (0:00:00.241) 0:22:18.044 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:29:11 -0400 (0:00:00.261) 0:22:18.306 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:29:11 -0400 (0:00:00.316) 0:22:18.622 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:29:17 -0400 (0:00:05.039) 0:22:23.662 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": 
"sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service": { "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service": { "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { 
"name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:29:20 -0400 (0:00:03.086) 0:22:26.748 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:29:20 -0400 (0:00:00.386) 0:22:27.135 *********** changed: [managed-node11] => (item=systemd-cryptsetup@luks\x2d81fa0ad9\x2d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; 
argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": 
"100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2025-06-16 17:24:26 EDT", "StateChangeTimestampMonotonic": "2702198719", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": 
"infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:29:23 -0400 (0:00:02.837) 0:22:29.972 *********** fatal: [managed-node11]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Monday 16 June 2025 17:29:28 -0400 (0:00:05.535) 0:22:35.507 *********** fatal: [managed-node11]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:29:29 -0400 (0:00:00.279) 0:22:35.786 *********** changed: [managed-node11] => 
(item=systemd-cryptsetup@luks\x2d81fa0ad9\x2d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d81fa0ad9\\x2d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node11] => (item=systemd-cryptsetup@luk...d7bd0\x2d4113\x2d88f7\x2d8e5459e0d1d0.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "name": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7bd0\\x2d4113\\x2d88f7\\x2d8e5459e0d1d0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Monday 16 June 2025 17:29:32 -0400 (0:00:03.556) 0:22:39.343 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Monday 16 June 2025 17:29:33 -0400 (0:00:00.357) 0:22:39.700 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 
TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Monday 16 June 2025 17:29:33 -0400 (0:00:00.333) 0:22:40.033 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 16 June 2025 17:29:34 -0400 (0:00:00.839) 0:22:40.873 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109335.0895667, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750109335.0895667, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1750109335.0895667, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1948499703", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 16 June 2025 17:29:35 -0400 (0:00:01.058) 0:22:41.931 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:507 Monday 16 June 2025 17:29:35 -0400 (0:00:00.240) 0:22:42.171 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:29:36 -0400 (0:00:00.604) 0:22:42.776 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:29:36 -0400 (0:00:00.365) 0:22:43.141 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:29:36 -0400 (0:00:00.257) 0:22:43.399 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs",
"stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:29:37 -0400 (0:00:00.596) 0:22:43.995 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:29:37 -0400 (0:00:00.214) 0:22:44.210 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:29:37 -0400 (0:00:00.364) 0:22:44.574 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:29:38 -0400 (0:00:00.218) 0:22:44.793 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:29:38 -0400 (0:00:00.141) 0:22:44.934 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:29:38 -0400 (0:00:00.412) 0:22:45.347 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:29:43 -0400 (0:00:04.760) 0:22:50.107 *********** ok: [managed-node11] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, 
"encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:29:43 -0400 (0:00:00.204) 0:22:50.311 *********** ok: [managed-node11] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:29:44 -0400 (0:00:00.326) 0:22:50.638 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:29:49 -0400 (0:00:05.402) 0:22:56.041 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:29:49 -0400 (0:00:00.489) 0:22:56.531 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:29:50 -0400 (0:00:00.202) 0:22:56.733 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:29:50 -0400 (0:00:00.228) 0:22:56.961 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:29:50 -0400 (0:00:00.184) 0:22:57.146 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:29:55 -0400 (0:00:04.745) 0:23:01.891 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": 
"auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": 
"nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { 
"name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set 
storage_cryptsetup_services] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:29:57 -0400 (0:00:02.668) 0:23:04.560 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:29:58 -0400 (0:00:00.378) 0:23:04.938 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:29:58 -0400 (0:00:00.144) 0:23:05.082 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": 
null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:30:12 -0400 (0:00:14.042) 0:23:19.124 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 17:30:12 -0400 (0:00:00.252) 0:23:19.377 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109256.1159332, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1750109256.1129332, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750109256.1129332, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:30:14 -0400 (0:00:01.667) 0:23:21.044 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:30:16 -0400 (0:00:01.810) 0:23:22.855 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:30:16 -0400 (0:00:00.180) 0:23:23.036 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:30:16 -0400 (0:00:00.326) 0:23:23.362 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:30:16 -0400 (0:00:00.166) 0:23:23.529 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:30:17 -0400 (0:00:00.220) 0:23:23.750 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:30:18 -0400 (0:00:01.526) 0:23:25.276 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:30:20 -0400 (0:00:01.530) 0:23:26.807 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:30:21 -0400 (0:00:01.712) 0:23:28.519 *********** skipping: [managed-node11] => (item={'src': '/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { 
"ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:30:22 -0400 (0:00:00.382) 0:23:28.902 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:30:24 -0400 (0:00:01.786) 0:23:30.688 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109269.192038, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1750109261.0119724, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 245366983, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1750109261.0109725, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "299109841", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:30:25 -0400 (0:00:01.353) 0:23:32.042 *********** changed: [managed-node11] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-7ed396ae-2a37-4530-997a-8f9a8e561b91', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Monday 16 June 2025 17:30:27 -0400 (0:00:01.748) 0:23:33.790 *********** ok: [managed-node11] TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:524 Monday 16 June 2025 17:30:29 -0400 (0:00:02.036) 0:23:35.827 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:30:29 -0400 (0:00:00.628) 0:23:36.456 *********** ok: [managed-node11] => { "_storage_pools_list": [ { "disks": 
[ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:30:30 -0400 (0:00:00.414) 0:23:36.870 *********** skipping: [managed-node11] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:30:30 -0400 (0:00:00.267) 0:23:37.138 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "7ed396ae-2a37-4530-997a-8f9a8e561b91" }, "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "size": "4G", "type": "crypt", "uuid": "ac18fcad-7573-4b19-9152-e4d89f2ee440" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "al0uCf-R2qd-eaR5-lcbe-8d4g-1JeN-2yYBO1" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:30:32 -0400 (0:00:01.531) 0:23:38.670 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002651", "end": "2025-06-16 17:30:33.363431", "rc": 0, "start": "2025-06-16 17:30:33.360780" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 16 June 2025 17:30:33 -0400 (0:00:01.633) 0:23:40.303 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002562", "end": "2025-06-16 17:30:35.127926", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:30:35.125364" } STDOUT: luks-7ed396ae-2a37-4530-997a-8f9a8e561b91 /dev/mapper/foo-test1 -
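Both file reads above feed simple content assertions: /etc/fstab must now carry the mount line for the mapped device, and /etc/crypttab must carry exactly the mapping that was added. A minimal sketch of that read-then-assert pattern (task and variable names here are illustrative, not the test's own):

# Sketch of the check pattern; the expected line is copied from the log.
- name: Read /etc/crypttab
  command: cat /etc/crypttab
  register: storage_test_crypttab   # illustrative register name
  changed_when: false

- name: Assert the LUKS mapping is listed
  assert:
    that:
      - "'luks-7ed396ae-2a37-4530-997a-8f9a8e561b91 /dev/mapper/foo-test1 -' in storage_test_crypttab.stdout_lines"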
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 16 June 2025 17:30:35 -0400 (0:00:01.785) 0:23:42.089 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node11 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 16 June 2025 17:30:35 -0400 (0:00:00.491) 0:23:42.580 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 16 June 2025 17:30:36 -0400 (0:00:00.209) 0:23:42.790 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.021346", "end": "2025-06-16 17:30:37.391759", "rc": 0, "start": "2025-06-16 17:30:37.370413" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 16 June 2025 17:30:37 -0400 (0:00:01.536) 0:23:44.327 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 16 June 2025 17:30:38 -0400 (0:00:00.416) 0:23:44.743 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 16 June 2025 17:30:38 -0400 (0:00:00.616) 0:23:45.360 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 16 June 2025 17:30:39 -0400 (0:00:00.399) 0:23:45.759 *********** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 16 June 2025 17:30:40 -0400 (0:00:01.494) 0:23:47.253 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 16 June 2025 17:30:40 -0400 (0:00:00.208) 0:23:47.462 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 16 June 2025 17:30:41 -0400 (0:00:00.285) 0:23:47.747 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 16 June 2025 17:30:41 -0400 (0:00:00.792) 0:23:48.543 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 16 June 2025 17:30:42 -0400 (0:00:00.302) 0:23:48.846 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 16 June 2025 17:30:42 -0400 (0:00:00.317) 0:23:49.163 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:54 Monday 16 June 2025 17:30:42 -0400 (0:00:00.228) 0:23:49.391 *********** ok: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:67 Monday 16 June 2025 17:30:43 -0400 (0:00:00.292) 0:23:49.684 *********** ok: [managed-node11] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.162 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:77 Monday 16 June 2025 17:30:44 -0400 (0:00:01.655) 0:23:51.340 *********** skipping: [managed-node11] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Monday 16 June 2025 17:30:44 -0400 (0:00:00.222) 0:23:51.562 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node11 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 16 June 2025 17:30:45 -0400 (0:00:00.598) 0:23:52.160 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 16 June 2025 17:30:45 -0400 (0:00:00.224) 0:23:52.385 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 16 June 2025 17:30:46 -0400 (0:00:00.340) 0:23:52.726 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 16 June 2025 17:30:46 -0400 (0:00:00.428) 0:23:53.154 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 16 June 2025 17:30:46 -0400 (0:00:00.193) 0:23:53.348 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 16 June 2025 17:30:47 -0400 (0:00:00.323) 0:23:53.671 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 16 June 2025 17:30:47 -0400 (0:00:00.321) 0:23:53.992 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 16 June 2025 17:30:47 -0400 (0:00:00.228) 0:23:54.221 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 16 June 2025 17:30:47 -0400 (0:00:00.240) 0:23:54.461 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 16 June 2025 17:30:48 -0400 (0:00:00.250) 0:23:54.712 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 16 June 2025 17:30:48 -0400 (0:00:00.303) 0:23:55.016 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Monday 16 June 2025 17:30:48 -0400 (0:00:00.208) 0:23:55.224 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node11 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 16 June 2025 17:30:49 -0400 (0:00:00.543) 0:23:55.767 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node11 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 16 June 2025 17:30:49 -0400 (0:00:00.434) 0:23:56.202 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 16 June 2025 17:30:49 -0400 (0:00:00.214) 0:23:56.416 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 16 June 2025 17:30:50 -0400 (0:00:00.313) 0:23:56.729 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 16 June 2025 17:30:50 -0400 (0:00:00.325) 0:23:57.055 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 16 June 2025 17:30:50 -0400 (0:00:00.263) 0:23:57.318 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 16 June 2025 17:30:50 -0400 (0:00:00.265) 0:23:57.584 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 16 June 2025 17:30:51 -0400 (0:00:00.271) 0:23:57.856 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Monday 16 June 2025 17:30:51 -0400 (0:00:00.244) 0:23:58.101 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node11 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 16 June 2025 17:30:52 -0400 (0:00:00.531) 0:23:58.632 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node11 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 16 June 2025 17:30:52 -0400 (0:00:00.426) 0:23:59.059 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 16 June 2025 17:30:52 -0400 (0:00:00.087) 0:23:59.147 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 16 June 2025 17:30:52 -0400 (0:00:00.065) 0:23:59.213 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 16 June 2025 17:30:52 -0400 (0:00:00.101) 0:23:59.314 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Monday 16 June 2025 17:30:52 -0400 (0:00:00.101) 0:23:59.415 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node11 TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 16 June 2025 17:30:53 -0400 (0:00:00.460) 0:23:59.876 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 16 June 2025 17:30:53 -0400 (0:00:00.140) 0:24:00.016 *********** skipping: [managed-node11] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 16 June 2025 17:30:53 -0400 (0:00:00.261) 0:24:00.277 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node11 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 16 June 2025 17:30:54 -0400 (0:00:00.358) 0:24:00.636 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 16 June 2025 17:30:54 -0400 (0:00:00.198) 0:24:00.834 *********** ok: [managed-node11] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 16 June 2025 17:30:54 -0400 (0:00:00.158) 0:24:00.993 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 16 June 2025 17:30:54 -0400 (0:00:00.258) 0:24:01.251 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 16 June 2025 17:30:54 -0400 (0:00:00.048) 0:24:01.300 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 16 June 2025 17:30:54 -0400 (0:00:00.051) 0:24:01.351 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 16 June 2025 17:30:54 -0400 (0:00:00.120) 0:24:01.471 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Monday 16 June 2025 17:30:55 -0400 (0:00:00.219) 0:24:01.691 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node11 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 16 June 2025 17:30:55 -0400 (0:00:00.301) 0:24:01.993 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node11 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 16 June 2025 17:30:55 -0400 (0:00:00.282) 0:24:02.275 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 16 June 2025 17:30:56 -0400 (0:00:00.376) 0:24:02.652 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 16 June 2025 17:30:56 -0400 (0:00:00.311) 0:24:02.963 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 16 June 2025 17:30:56 -0400 (0:00:00.301) 0:24:03.265 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is off] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 16 June 2025 17:30:56 -0400 (0:00:00.339) 0:24:03.604 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is on] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 16 June 2025 17:30:57 -0400 (0:00:00.215) 0:24:03.820 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 16 June 2025 17:30:57 -0400 (0:00:00.261) 0:24:04.081 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Monday 16 June 2025 17:30:57 -0400 (0:00:00.285) 0:24:04.366 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node11 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 16 June 2025 17:30:58 -0400 (0:00:00.691) 0:24:05.058 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 16 June 2025 17:30:58 -0400 (0:00:00.294) 0:24:05.352 *********** skipping: [managed-node11] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 16 June 2025 17:30:59 -0400 (0:00:00.334) 0:24:05.687 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 16 June 2025 17:30:59 
-0400 (0:00:00.278) 0:24:05.965 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 16 June 2025 17:30:59 -0400 (0:00:00.300) 0:24:06.266 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 16 June 2025 17:30:59 -0400 (0:00:00.302) 0:24:06.569 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 16 June 2025 17:31:00 -0400 (0:00:00.286) 0:24:06.855 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:105 Monday 16 June 2025 17:31:00 -0400 (0:00:00.235) 0:24:07.091 *********** ok: [managed-node11] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 16 June 2025 17:31:00 -0400 (0:00:00.314) 0:24:07.405 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:31:01 -0400 (0:00:00.546) 0:24:07.952 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:31:01 -0400 (0:00:00.394) 0:24:08.346 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:31:03 -0400 (0:00:01.417) 0:24:09.764 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:31:03 -0400 (0:00:00.468) 0:24:10.232 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:31:03 -0400 (0:00:00.217) 0:24:10.496 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:31:04 -0400 (0:00:00.277) 0:24:10.714 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:31:04 -0400 (0:00:00.272) 0:24:10.992 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:31:04 -0400 (0:00:00.314) 0:24:11.264 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:31:04 -0400 (0:00:00.291) 0:24:11.578 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" }
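The mount checks above pass because the decrypted mapper device, not the raw LV, is what is mounted at /opt/test1. A sketch of such a check against gathered facts (assumes facts were gathered on the host; ansible_mounts is the stock fact, and the storage_test_* variables mirror the facts set in the log, so the task body itself is illustrative rather than the test's real code):

# Sketch: exactly one mount entry should pair this device with this mount point.
- name: Verify the current mount state by device
  assert:
    that:
      - "ansible_mounts | selectattr('device', 'equalto', storage_test_device_path) | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point) | list | length == 1"
    msg: "{{ storage_test_device_path }} is not mounted on {{ storage_test_mount_expected_mount_point }}"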
TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 16 June 2025 17:31:05 -0400 (0:00:00.291) 0:24:11.869 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 16 June 2025 17:31:05 -0400 (0:00:00.201) 0:24:12.071 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 16 June 2025 17:31:05 -0400 (0:00:00.193) 0:24:12.264 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 16 June 2025 17:31:05 -0400 (0:00:00.227) 0:24:12.491 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 16 June 2025 17:31:06 -0400 (0:00:00.251) 0:24:12.743 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 16 June 2025 17:31:06 -0400 (0:00:00.452) 0:24:13.195 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 16 June 2025 17:31:06 -0400 (0:00:00.256) 0:24:13.451 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 16 June 2025 17:31:07 -0400 (0:00:00.225) 0:24:13.676 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 16 June 2025 17:31:07 -0400 (0:00:00.386) 0:24:14.063 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 16 June 2025 17:31:07 -0400 (0:00:00.395) 0:24:14.459 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 16 June 2025 17:31:08 -0400 (0:00:00.242) 0:24:14.701 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 16 June 2025 17:31:09 -0400 (0:00:00.998) 0:24:15.700 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 16 June 2025 17:31:09 -0400 (0:00:00.332) 0:24:16.033 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109412.1091845, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750109412.1091845, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 259938, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750109412.1091845, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 16 June 2025 17:31:11 -0400 (0:00:01.620) 0:24:17.653 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 16 June 2025 17:31:11 -0400 (0:00:00.320) 0:24:17.974 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 16 June 2025 17:31:11 -0400 (0:00:00.351) 0:24:18.326 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 16 June 2025 17:31:11 -0400 (0:00:00.303) 0:24:18.629 *********** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 16 June 2025 17:31:12 -0400 (0:00:00.318) 0:24:18.948 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 16 June 2025 17:31:12 -0400 (0:00:00.171) 0:24:19.119 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 16 June 2025 17:31:12 -0400 (0:00:00.227) 0:24:19.347 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109412.2621858, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750109412.2621858, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 277860, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1750109412.2621858, "nlink": 1, "path": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 16 June 2025 17:31:14 -0400 (0:00:01.554) 0:24:20.902 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 16 June 2025 17:31:18 -0400 (0:00:04.602) 0:24:25.505 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010471", "end": "2025-06-16 17:31:20.243849", "rc": 0, "start": "2025-06-16 17:31:20.233378" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 7ed396ae-2a37-4530-997a-8f9a8e561b91 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 919618 Threads: 2 Salt: 65 b1 ea 7e d1 37 87 60 
20 65 1c 81 c2 6d 52 06 b2 21 86 93 55 c7 56 38 53 99 bf ea e7 c3 58 f4 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 4d 99 c0 8e 4f 55 37 62 f1 c2 27 cd 8d c6 15 3c e4 85 73 62 d9 41 9d c1 ce 8b b9 a9 4a 12 f7 58 Digest: 69 0c 88 25 ef 26 16 a0 8b b9 ea cd f9 f5 2c 77 97 c7 c4 ce 6e 5a 22 53 53 af 89 77 31 db a1 b4
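The dump above confirms a LUKS2 header on the raw LV: Version: 2, an aes-xts-plain64 data segment, and an argon2i keyslot. The "Check LUKS version" assertion that follows amounts to matching that Version line; a minimal sketch of the same check (the register name and regex are assumptions, only the cryptsetup command is taken from the log):

# Sketch: dump the LUKS header and assert it is LUKS2.
- name: Collect LUKS info for this volume
  command: cryptsetup luksDump /dev/mapper/foo-test1
  register: luks_dump   # illustrative register name
  changed_when: false

- name: Check LUKS version
  assert:
    that:
      - 'luks_dump.stdout is search("Version:\s+2")'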
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 16 June 2025 17:31:20 -0400 (0:00:01.716) 0:24:27.221 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 16 June 2025 17:31:20 -0400 (0:00:00.382) 0:24:27.604 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 16 June 2025 17:31:21 -0400 (0:00:00.347) 0:24:27.951 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 16 June 2025 17:31:21 -0400 (0:00:00.274) 0:24:28.226 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 16 June 2025 17:31:21 -0400 (0:00:00.438) 0:24:28.458 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 16 June 2025 17:31:22 -0400 (0:00:00.299) 0:24:28.896 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 16 June 2025 17:31:22 -0400 (0:00:00.258) 0:24:29.196 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 16 June 2025 17:31:22 -0400 (0:00:00.318) 0:24:29.454 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-7ed396ae-2a37-4530-997a-8f9a8e561b91 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 16 June 2025 17:31:23 -0400 (0:00:00.318) 0:24:29.772 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 16 June 2025 17:31:23 -0400 (0:00:00.340) 0:24:30.113 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 16 June 2025 17:31:23 -0400 (0:00:00.259) 0:24:30.373 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 16 June 2025 17:31:24 -0400 (0:00:00.348) 0:24:30.721 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 16 June 2025 17:31:24 -0400 (0:00:00.373) 0:24:31.095 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 16 June 2025 17:31:24 -0400 (0:00:00.253) 0:24:31.348 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 16 June 2025 17:31:25 -0400 (0:00:00.319) 0:24:31.668 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 16 June 2025 17:31:25 -0400 (0:00:00.259) 0:24:31.928 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 16 June 2025 17:31:25 -0400 (0:00:00.222) 0:24:32.151 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 16 June 2025 17:31:25 -0400 (0:00:00.243) 0:24:32.394 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:31:26 -0400 (0:00:00.251) 0:24:32.646 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:31:26 -0400 (0:00:00.220) 0:24:32.866 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:31:26 -0400 (0:00:00.227) 0:24:33.094 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:31:26 -0400 (0:00:00.265) 0:24:33.360 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:31:26 -0400 (0:00:00.234) 0:24:33.594 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:31:27 -0400 (0:00:00.252) 0:24:33.847 *********** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:31:28 -0400 (0:00:01.598) 0:24:35.445 *********** ok: [managed-node11] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:31:30 -0400 (0:00:01.777) 0:24:37.223 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }
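The two parse steps above resolve to the same value because the volume was requested as "4g" and created at exactly that size; the expected size is then simply the requested size converted to bytes. A sketch of that conversion with the stock human_to_bytes filter (variable names are illustrative; storage_test_actual_size stands in for the registered result of the size-parsing step):

# Sketch: convert the requested size to bytes and compare with the actual size.
- name: Establish base value for expected size
  set_fact:
    storage_test_expected_size: "{{ '4g' | human_to_bytes }}"   # "4g" -> 4294967296

- name: Assert the actual size matches the requested size
  assert:
    that:
      - storage_test_actual_size.bytes | int == storage_test_expected_size | int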
(0:00:00.289) 0:24:37.513 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:31:31 -0400 (0:00:00.179) 0:24:37.693 *********** ok: [managed-node11] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:31:32 -0400 (0:00:01.672) 0:24:39.365 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:31:32 -0400 (0:00:00.261) 0:24:39.627 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:31:33 -0400 (0:00:00.181) 0:24:39.809 *********** skipping: [managed-node11] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:31:33 -0400 (0:00:00.241) 0:24:40.051 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:31:33 -0400 (0:00:00.227) 0:24:40.278 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:31:33 -0400 (0:00:00.293) 0:24:40.571 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:31:34 -0400 (0:00:00.255) 0:24:40.827 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 16 June 2025 17:31:34 -0400 (0:00:00.243) 0:24:41.070 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 16 June 2025 17:31:34 -0400 
(0:00:00.226) 0:24:41.297 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:31:34 -0400 (0:00:00.270) 0:24:41.567 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:31:35 -0400 (0:00:00.343) 0:24:41.910 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:31:35 -0400 (0:00:00.279) 0:24:42.190 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:31:35 -0400 (0:00:00.362) 0:24:42.553 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:31:36 -0400 (0:00:00.248) 0:24:42.802 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:31:36 -0400 (0:00:00.310) 0:24:43.113 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:31:36 -0400 (0:00:00.293) 0:24:43.406 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:31:37 -0400 (0:00:00.906) 0:24:44.312 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:31:37 -0400 (0:00:00.262) 0:24:44.574 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 
17:31:38 -0400 (0:00:00.305) 0:24:44.880 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:31:38 -0400 (0:00:00.224) 0:24:45.105 *********** ok: [managed-node11] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:31:38 -0400 (0:00:00.267) 0:24:45.373 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:31:39 -0400 (0:00:00.297) 0:24:45.670 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:31:39 -0400 (0:00:00.329) 0:24:45.999 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.025003", "end": "2025-06-16 17:31:40.827673", "rc": 0, "start": "2025-06-16 17:31:40.802670" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:31:41 -0400 (0:00:01.766) 0:24:47.766 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:31:41 -0400 (0:00:00.359) 0:24:48.125 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:31:41 -0400 (0:00:00.428) 0:24:48.554 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:31:42 -0400 (0:00:00.329) 0:24:48.883 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] 
************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 16 June 2025 17:31:42 -0400 (0:00:00.320) 0:24:49.204 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 16 June 2025 17:31:42 -0400 (0:00:00.284) 0:24:49.488 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 16 June 2025 17:31:43 -0400 (0:00:00.248) 0:24:49.737 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:31:43 -0400 (0:00:00.189) 0:24:49.927 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 16 June 2025 17:31:43 -0400 (0:00:00.205) 0:24:50.132 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:527 Monday 16 June 2025 17:31:43 -0400 (0:00:00.276) 0:24:50.409 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 16 June 2025 17:31:44 -0400 (0:00:00.936) 0:24:51.345 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 16 June 2025 17:31:45 -0400 (0:00:00.353) 0:24:51.698 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 16 June 2025 17:31:45 -0400 (0:00:00.393) 0:24:52.091 *********** skipping: [managed-node11] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node11] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node11] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 16 June 2025 17:31:46 -0400 (0:00:00.716) 0:24:52.808 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 16 June 2025 17:31:46 -0400 (0:00:00.385) 0:24:53.193 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 16 June 2025 17:31:46 -0400 (0:00:00.266) 0:24:53.460 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 16 June 2025 17:31:47 -0400 (0:00:00.174) 0:24:53.635 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 16 June 2025 17:31:47 -0400 (0:00:00.271) 0:24:53.906 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Monday 16 June 2025 17:31:47 -0400 (0:00:00.575) 0:24:54.481 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 16 June 2025 17:31:52 -0400 (0:00:04.859) 0:24:59.341 *********** ok: [managed-node11] 
=> { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Monday 16 June 2025 17:31:53 -0400 (0:00:00.308) 0:24:59.650 *********** ok: [managed-node11] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Monday 16 June 2025 17:31:53 -0400 (0:00:00.334) 0:24:59.984 *********** ok: [managed-node11] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 16 June 2025 17:31:58 -0400 (0:00:05.531) 0:25:05.515 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node11 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 16 June 2025 17:31:59 -0400 (0:00:00.487) 0:25:06.003 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 16 June 2025 17:31:59 -0400 (0:00:00.279) 0:25:06.283 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 16 June 2025 17:31:59 -0400 (0:00:00.316) 0:25:06.599 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Monday 16 June 2025 17:32:00 -0400 (0:00:00.231) 0:25:06.831 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Monday 16 June 2025 17:32:04 -0400 (0:00:04.521) 0:25:11.353 *********** ok: [managed-node11] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": 
"sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task 
path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Monday 16 June 2025 17:32:07 -0400 (0:00:03.019) 0:25:14.372 *********** ok: [managed-node11] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Monday 16 June 2025 17:32:08 -0400 (0:00:00.431) 0:25:14.804 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Monday 16 June 2025 17:32:08 -0400 (0:00:00.219) 0:25:15.024 *********** changed: [managed-node11] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=al0uCf-R2qd-eaR5-lcbe-8d4g-1JeN-2yYBO1", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Monday 16 June 2025 17:32:14 -0400 (0:00:05.923) 0:25:20.947 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Monday 16 June 2025 
17:32:14 -0400 (0:00:00.229) 0:25:21.176 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109421.5812604, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3edfdb817799461e1ab71e029e0c0c9f8417ea58", "ctime": 1750109421.5782604, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1750109421.5782604, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "4217545767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Monday 16 June 2025 17:32:16 -0400 (0:00:01.625) 0:25:22.801 *********** ok: [managed-node11] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 16 June 2025 17:32:17 -0400 (0:00:01.484) 0:25:24.286 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Monday 16 June 2025 17:32:17 -0400 (0:00:00.209) 0:25:24.495 *********** ok: [managed-node11] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=al0uCf-R2qd-eaR5-lcbe-8d4g-1JeN-2yYBO1", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": 
"defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Monday 16 June 2025 17:32:18 -0400 (0:00:00.151) 0:25:24.647 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Monday 16 June 2025 17:32:18 -0400 (0:00:00.115) 0:25:24.762 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=al0uCf-R2qd-eaR5-lcbe-8d4g-1JeN-2yYBO1", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Monday 16 June 2025 17:32:18 -0400 (0:00:00.215) 0:25:24.977 *********** changed: [managed-node11] => (item={'src': '/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7ed396ae-2a37-4530-997a-8f9a8e561b91" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Monday 16 June 2025 17:32:20 -0400 (0:00:01.835) 0:25:26.812 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Monday 16 June 2025 17:32:22 -0400 (0:00:01.953) 
0:25:28.766 *********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Monday 16 June 2025 17:32:22 -0400 (0:00:00.266) 0:25:29.033 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Monday 16 June 2025 17:32:22 -0400 (0:00:00.228) 0:25:29.261 *********** ok: [managed-node11] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Monday 16 June 2025 17:32:24 -0400 (0:00:01.991) 0:25:31.253 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109435.1273692, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "14548c9b5814db3393299e6f8643c46a4168f093", "ctime": 1750109426.878303, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 400556168, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1750109426.876303, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1422660517", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Monday 16 June 2025 17:32:26 -0400 (0:00:01.424) 0:25:32.677 *********** changed: [managed-node11] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-7ed396ae-2a37-4530-997a-8f9a8e561b91', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-7ed396ae-2a37-4530-997a-8f9a8e561b91", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Monday 16 June 2025 17:32:27 -0400 (0:00:01.681) 0:25:34.359 *********** ok: [managed-node11] TASK [Verify role results] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:537 Monday 16 June 2025 17:32:29 -0400 (0:00:01.938) 0:25:36.297 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node11 TASK [Print out pool information] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 16 June 2025 17:32:30 -0400 (0:00:00.690) 0:25:36.987 *********** skipping: [managed-node11] => {} TASK [Print out volume information] 
******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 16 June 2025 17:32:30 -0400 (0:00:00.159) 0:25:37.147 *********** ok: [managed-node11] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=al0uCf-R2qd-eaR5-lcbe-8d4g-1JeN-2yYBO1", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 16 June 2025 17:32:30 -0400 (0:00:00.206) 0:25:37.354 *********** ok: [managed-node11] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 16 June 2025 17:32:32 -0400 (0:00:01.625) 0:25:38.979 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004306", "end": "2025-06-16 17:32:34.610013", "rc": 0, "start": "2025-06-16 17:32:33.605707" } STDOUT: # system_role:storage # # /etc/fstab # Created 
by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 16 June 2025 17:32:34 -0400 (0:00:02.599) 0:25:41.579 *********** ok: [managed-node11] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002886", "end": "2025-06-16 17:32:35.975095", "failed_when_result": false, "rc": 0, "start": "2025-06-16 17:32:35.972209" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 16 June 2025 17:32:36 -0400 (0:00:01.330) 0:25:42.909 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 16 June 2025 17:32:36 -0400 (0:00:00.255) 0:25:43.165 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node11 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 16 June 2025 17:32:37 -0400 (0:00:00.478) 0:25:43.643 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 16 June 2025 17:32:37 -0400 (0:00:00.298) 0:25:43.942 *********** included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node11 included: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node11 included: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node11 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 16 June 2025 17:32:38 -0400 (0:00:01.578) 0:25:45.520 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 16 June 2025 17:32:39 -0400 (0:00:00.313) 0:25:45.834 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 16 June 2025 17:32:39 -0400 (0:00:00.373) 0:25:46.207 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 16 June 2025 17:32:39 -0400 (0:00:00.230) 0:25:46.438 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 16 June 2025 17:32:39 -0400 (0:00:00.107) 0:25:46.545 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 16 June 2025 17:32:40 -0400 (0:00:00.194) 0:25:46.739 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 16 June 2025 17:32:40 -0400 (0:00:00.217) 0:25:46.957 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume 
device] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 16 June 2025 17:32:40 -0400 (0:00:00.100) 0:25:47.057 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 16 June 2025 17:32:40 -0400 (0:00:00.152) 0:25:47.210 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 16 June 2025 17:32:40 -0400 (0:00:00.096) 0:25:47.306 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 16 June 2025 17:32:40 -0400 (0:00:00.127) 0:25:47.434 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 16 June 2025 17:32:40 -0400 (0:00:00.113) 0:25:47.548 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 16 June 2025 17:32:41 -0400 (0:00:00.409) 0:25:47.957 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 16 June 2025 17:32:41 -0400 (0:00:00.202) 0:25:48.159 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 16 June 2025 17:32:41 -0400 (0:00:00.154) 0:25:48.313 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 16 June 2025 17:32:41 -0400 (0:00:00.228) 0:25:48.542 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 16 June 2025 17:32:42 -0400 (0:00:00.293) 0:25:48.835 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 16 June 2025 17:32:42 -0400 (0:00:00.120) 0:25:48.955 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 16 June 2025 17:32:42 -0400 (0:00:00.152) 0:25:49.108 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 16 June 2025 17:32:42 -0400 (0:00:00.256) 0:25:49.364 *********** ok: [managed-node11] => { "changed": false, "stat": { "atime": 1750109533.9321618, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1750109533.9321618, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36924, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1750109533.9321618, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 16 June 2025 17:32:44 -0400 (0:00:01.451) 0:25:50.816 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 16 June 2025 17:32:44 -0400 (0:00:00.239) 0:25:51.055 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 16 June 2025 17:32:44 -0400 (0:00:00.217) 0:25:51.272 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 16 June 2025 17:32:44 -0400 (0:00:00.164) 0:25:51.437 *********** ok: [managed-node11] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 16 June 2025 17:32:45 -0400 (0:00:00.269) 0:25:51.706 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 16 June 2025 17:32:45 -0400 (0:00:00.186) 0:25:51.893 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 16 June 2025 17:32:45 -0400 (0:00:00.068) 0:25:51.961 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 16 June 2025 17:32:45 -0400 (0:00:00.096) 0:25:52.057 *********** ok: [managed-node11] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 16 June 2025 17:32:49 -0400 (0:00:04.144) 0:25:56.202 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 16 June 2025 17:32:49 -0400 (0:00:00.247) 0:25:56.450 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 16 June 2025 17:32:50 -0400 (0:00:00.268) 0:25:56.719 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 16 June 2025 17:32:50 -0400 
(0:00:00.185) 0:25:56.904 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 16 June 2025 17:32:50 -0400 (0:00:00.188) 0:25:57.093 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 16 June 2025 17:32:50 -0400 (0:00:00.251) 0:25:57.344 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 16 June 2025 17:32:50 -0400 (0:00:00.181) 0:25:57.526 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 16 June 2025 17:32:51 -0400 (0:00:00.156) 0:25:57.682 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 16 June 2025 17:32:51 -0400 (0:00:00.227) 0:25:57.910 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 16 June 2025 17:32:51 -0400 (0:00:00.282) 0:25:58.192 *********** ok: [managed-node11] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 16 June 2025 17:32:51 -0400 (0:00:00.313) 0:25:58.505 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 16 June 2025 17:32:52 -0400 (0:00:00.349) 0:25:58.854 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 16 June 2025 17:32:52 -0400 (0:00:00.222) 0:25:59.077 *********** 
skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 16 June 2025 17:32:52 -0400 (0:00:00.296) 0:25:59.373 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 16 June 2025 17:32:52 -0400 (0:00:00.255) 0:25:59.629 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 16 June 2025 17:32:53 -0400 (0:00:00.263) 0:25:59.893 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 16 June 2025 17:32:53 -0400 (0:00:00.254) 0:26:00.147 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 16 June 2025 17:32:53 -0400 (0:00:00.177) 0:26:00.324 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 16 June 2025 17:32:53 -0400 (0:00:00.230) 0:26:00.555 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 16 June 2025 17:32:54 -0400 (0:00:00.163) 0:26:00.718 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 16 June 2025 17:32:54 -0400 (0:00:00.229) 0:26:00.948 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 16 June 2025 17:32:54 -0400 (0:00:00.182) 0:26:01.130 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 16 June 2025 17:32:54 -0400 (0:00:00.197) 0:26:01.327 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 16 June 2025 17:32:54 -0400 (0:00:00.229) 0:26:01.557 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 16 June 2025 17:32:55 -0400 (0:00:00.156) 0:26:01.713 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 16 June 2025 17:32:55 -0400 (0:00:00.153) 0:26:01.867 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 16 June 2025 17:32:55 -0400 (0:00:00.156) 0:26:02.023 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 16 June 2025 17:32:55 -0400 (0:00:00.205) 0:26:02.228 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 16 June 2025 17:32:55 -0400 (0:00:00.279) 0:26:02.508 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 16 June 2025 17:32:56 -0400 (0:00:00.259) 0:26:02.768 *********** skipping: [managed-node11] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 16 June 2025 17:32:56 -0400 (0:00:00.182) 0:26:02.950 *********** skipping: [managed-node11] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 16 June 2025 17:32:56 -0400 (0:00:00.183) 0:26:03.133 *********** skipping: [managed-node11] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 16 June 2025 17:32:56 -0400 (0:00:00.317) 0:26:03.451 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 16 June 2025 17:32:57 -0400 (0:00:00.214) 0:26:03.665 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 16 June 2025 17:32:57 -0400 (0:00:00.146) 0:26:03.812 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 16 June 2025 17:32:57 -0400 (0:00:00.176) 0:26:03.989 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 16 June 2025 17:32:57 -0400 (0:00:00.179) 0:26:04.168 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 16 June 2025 17:32:57 -0400 (0:00:00.243) 0:26:04.411 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 16 June 2025 17:32:57 -0400 (0:00:00.177) 0:26:04.589 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 16 June 2025 17:32:58 -0400 (0:00:00.161) 0:26:04.750 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 16 June 2025 17:32:58 -0400 (0:00:00.605) 0:26:05.356 *********** skipping: [managed-node11] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 16 June 2025 17:32:58 -0400 
(0:00:00.171) 0:26:05.528 *********** skipping: [managed-node11] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 16 June 2025 17:32:59 -0400 (0:00:00.211) 0:26:05.740 *********** skipping: [managed-node11] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 16 June 2025 17:32:59 -0400 (0:00:00.227) 0:26:05.967 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 16 June 2025 17:32:59 -0400 (0:00:00.173) 0:26:06.140 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 16 June 2025 17:32:59 -0400 (0:00:00.196) 0:26:06.337 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 16 June 2025 17:32:59 -0400 (0:00:00.191) 0:26:06.529 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 16 June 2025 17:33:00 -0400 (0:00:00.138) 0:26:06.667 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 16 June 2025 17:33:00 -0400 (0:00:00.169) 0:26:06.837 *********** ok: [managed-node11] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 16 June 2025 17:33:00 -0400 (0:00:00.176) 0:26:07.013 *********** ok: [managed-node11] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 16 June 2025 17:33:00 -0400 (0:00:00.107) 0:26:07.121 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 16 June 2025 17:33:00 -0400 (0:00:00.141) 0:26:07.263 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 16 June 2025 17:33:00 -0400 (0:00:00.143) 0:26:07.407 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 16 June 2025 17:33:00 -0400 (0:00:00.142) 0:26:07.550 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 16 June 2025 17:33:01 -0400 (0:00:00.207) 0:26:07.757 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 16 June 2025 17:33:01 -0400 (0:00:00.151) 0:26:07.909 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 16 June 2025 17:33:01 -0400 (0:00:00.215) 0:26:08.124 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 16 June 2025 17:33:01 -0400 (0:00:00.129) 0:26:08.253 *********** skipping: [managed-node11] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 16 June 2025 17:33:01 -0400 (0:00:00.142) 0:26:08.395 *********** ok: [managed-node11] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 16 June 2025 17:33:01 -0400 (0:00:00.225) 0:26:08.621 *********** ok: [managed-node11] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node11 : ok=1229 changed=60 unreachable=0 failed=9 skipped=1068 rescued=9 ignored=0 
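The recap's failed=9 / rescued=9 counts reflect failures that the test provokes deliberately and then rescues; each one is itemized in the error summary that follows. The recurring "encrypted volume ... missing key/password" errors arise whenever a volume sets encryption: true but supplies neither encryption_key nor encryption_password, as the module_args captured below show. A minimal sketch of an invocation that supplies a password, assuming the role's documented storage_volumes variable and using only fields that appear in this log (the password value is an illustrative placeholder, not taken from this run):

- hosts: managed-node11
  vars:
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        fs_type: xfs
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        encryption_password: CHANGE_ME   # placeholder; a real run would pull this from vault
  roles:
    - fedora.linux_system_roles.storage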
SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:08:12.829756+00:00Z", "host": "managed-node11", "message": "encrypted volume 'foo' missing key/password", "start_time": "2025-06-16T21:08:07.640230+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:08:12.931842+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:08:12.856907+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:10:20.405224+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'luks-fec24d04-7e30-407c-9ca8-79d9c198f63d' in safe mode due 
to encryption removal", "start_time": "2025-06-16T21:10:15.096310+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:10:20.662210+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-fec24d04-7e30-407c-9ca8-79d9c198f63d' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:10:20.481890+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:12:18.374222+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2025-06-16T21:12:12.901802+00:00Z", "task_name": "Manage 
the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:12:18.612591+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:12:18.415062+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:14:20.859529+00:00Z", "host": "managed-node11", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-06-16T21:14:15.209535+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" 
}, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:14:21.195534+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:14:20.912163+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:16:47.294481+00:00Z", "host": "managed-node11", "message": "cannot 
remove existing formatting on device 'luks-676decb8-5a23-4d42-ba5c-1b3b350e8738' in safe mode due to encryption removal", "start_time": "2025-06-16T21:16:42.157999+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:16:47.518172+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove 
existing formatting on device 'luks-676decb8-5a23-4d42-ba5c-1b3b350e8738' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:16:47.382257+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:19:09.177810+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2025-06-16T21:19:03.607787+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:19:09.476167+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:19:09.205779+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:21:43.747636+00:00Z", "host": "managed-node11", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-06-16T21:21:37.889288+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:21:43.989855+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:21:43.784179+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:26:44.940989+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0' in safe mode due to encryption removal", "start_time": "2025-06-16T21:26:39.263854+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:26:45.290381+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": 
null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-81fa0ad9-7bd0-4113-88f7-8e5459e0d1d0' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:26:44.978008+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:29:28.833670+00:00Z", "host": "managed-node11", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2025-06-16T21:29:23.343611+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-16T21:29:29.123451+00:00Z", "host": "managed-node11", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-16T21:29:28.878787+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Monday 16 June 2025 17:33:02 -0400 (0:00:00.233) 0:26:08.855 *********** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.53s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.12s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.04s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.40s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.38s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.33s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match 
the specified state --- 6.12s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Make sure blivet is available ------- 5.97s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.92s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.89s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.83s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.72s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.71s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.70s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Get required packages --------------- 5.68s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 5.67s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 fedora.linux_system_roles.storage : Get required packages --------------- 5.65s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 5.62s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.60s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Get required packages --------------- 5.60s /tmp/collections-EWW/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
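
The failures recorded in the errors block above come in two kinds, both visible in the module_args: "encrypted volume 'test1' missing key/password" (an encrypted volume was requested with no encryption_password or encryption_key) and "cannot remove existing formatting ... in safe mode" (safe_mode was true, so blivet refused a destructive re-format). A minimal sketch of a playbook that would pass both checks, assuming the storage role's public variables as reflected in the module_args of this run (storage_safe_mode, storage_pools, encryption_password); the password literal is a placeholder, not a value from this log:

    # Sketch only: creates an encrypted LUKS2 volume without tripping
    # the role's safe-mode guard. Variable names mirror the module_args
    # captured above; the secret below is a dummy placeholder.
    - name: Create an encrypted LUKS2 volume on sda
      hosts: managed-node11
      vars:
        storage_safe_mode: false        # permit removing existing formatting
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo   # placeholder secret
      roles:
        - fedora.linux_system_roles.storage

Leaving storage_safe_mode at its default of true is what produced the "cannot remove existing formatting" errors above: the role deliberately refuses to destroy existing data (including adding or removing a LUKS layer) unless the caller opts out of safe mode.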